Gigaom brings you our unique analysis and commentary on the present and future of AI.
As I've said on a number of these AI Minutes, we don't understand the human brain. We don't know how a thought is encoded and retrieved, but with regard to artificial intelligence, that may be okay. We may not need to understand the brain to build an AI. But we also don't understand how it is that we are intelligent. We don't know how our brain is able to produce things like creativity, and humor, and all of the things that seem to be beyond what an organ should be able to do. But again, maybe it's okay with regard to artificial intelligence that we don't understand how it is that we are intelligent.
However, it's worse than that, because we don't understand what intelligence is. I think that can be demonstrated by the fact that there's no consensus definition of intelligence. There are dozens, if not hundreds, of different takes on what intelligence is.
And because of that, it seems highly unlikely to me that we will ever design an artificial general intelligence. Without understanding what intelligence is and how we are intelligent, the notion that we can actually design and build one seems beyond us. Now, we may be able to evolve one. We may be able to teach it to teach itself. But the idea that we can somehow build an artificial general intelligence, I find to be unproven.