Gigaom brings you our unique analysis and commentary on the present and future of AI.
I'm spending a few AI Minutes talking about common sense, and by that I mean things like knowing you don't walk off a cliff. Now, if I were to ask you a question like, "If you put an ice cube in a box, will it be there tomorrow?" you instantly know the answer to that, or have a pretty good guess, right? You realize the ice cube would melt and it would just be water in the box and it would seep out. You may ask questions like, "Is the temperature less than freezing outside where the box is?" and "Does the box have any way for the water to leak out?" and so forth, but the key property of the ice, that it melts, is what goes into one's head right away. And yet, why?
If you think about it, that ice cube has a hundred, a thousand other properties. It has a certain color, a certain size, a certain weight, a certain temperature, it exists at a certain time in a certain place under a certain set of conditions, and all of the rest of it. And yet, our minds instantly go to the fact that the ice cube will melt at room temperature.
Maybe, in the back of our heads, we're wondering why it's an ice cube you put in the box and not a set of car keys or a nickel. And maybe that's the thing about it that's different. But if you think about it, even that doesn't get us there. The set of car keys and the nickel aren't cold either, and they aren't the same color as an ice cube. So why is it that we know that that's the thing about the ice cube in the box that matters? That, at its core, is the kind of question we don't know how to answer. And it's that lack of an answer that limits our ability to make real progress toward a general intelligence.