Merlin
Can A.I. Be Taught to Explain Itself?
Mon Dec 4, 2017 1:46pm
24.228.93.156

As machine learning becomes more powerful, the field's researchers increasingly find themselves unable to account for what their algorithms know — or how they know it.

https://nyti.ms/2hR1S15

I found this article interesting, but it's too long to post here in full.

In math class, we used to have to show our work, not merely have the correct answer.

But then, the instructor knew what the correct answer was.

In this case, the AI developers sometimes do not know what the answer is, or how the program arrived at its answer.

Hard to believe that AI trading programs are making billion-dollar investment decisions, yet neither the user nor the developer knows why the AI decided to buy or sell.

We may have outsmarted ourselves.