Gigaom AI Minute – January 11

How do you teach a machine moral rules? There is, of course, no data set on which to train it. There is no agreed-upon scripture, and even if there were, there is no agreed-upon interpretation of that scripture. There is no single moral precept that can be applied universally in every situation. And even if a single person wanted to teach the AI their own values, their own moral code, there are so many exceptions, provisos, and situational judgments that it probably cannot be done. If all of this is true, then a machine can never be a moral agent, and moral responsibility will always rest with the human behind the machine.