Google Android handsets with front-facing cameras can now be controlled using simple hand gestures, courtesy of eyeSight Mobile Technologies. The privately held Israeli company uses a phone’s camera to watch for gestures, which software then interprets into commands such as zooming a map, controlling digital audio playback, or answering a phone call.
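eyeSight's actual algorithm is proprietary, but the general idea of turning camera input into commands can be sketched simply: compare consecutive frames, and when the scene changes enough (say, a hand sweeping past the lens), fire a command. The threshold, function names, and the "answer_call" command below are all illustrative assumptions, not eyeSight's API.

```python
# Toy sketch of camera-based gesture sensing (illustrative only; eyeSight's
# real pipeline is far more sophisticated). Frames are grayscale pixel grids.

MOTION_THRESHOLD = 10.0  # mean absolute pixel change that counts as a gesture


def motion_energy(prev_frame, frame):
    """Mean absolute difference between two same-sized grayscale frames."""
    diffs = [
        abs(a - b)
        for prev_row, row in zip(prev_frame, frame)
        for a, b in zip(prev_row, row)
    ]
    return sum(diffs) / len(diffs)


def interpret(prev_frame, frame):
    """Map detected motion to a command; None when the scene is still."""
    if motion_energy(prev_frame, frame) > MOTION_THRESHOLD:
        return "answer_call"  # e.g., wave at the phone to receive a call
    return None


still = [[0] * 4 for _ in range(4)]    # empty scene
wave = [[200] * 4 for _ in range(4)]   # hand sweeping past the lens

print(interpret(still, still))  # no motion -> None
print(interpret(still, wave))   # large change -> "answer_call"
```

A real implementation would also have to distinguish deliberate gestures from camera shake and lighting changes, which is where the hard computer-vision work lives.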
eyeSight is providing its solution to Android device manufacturers with the expectation that future handsets will follow the recently launched, record-selling Android-powered HTC EVO on Sprint in sporting a front-facing camera. As Itay Katz, eyeSight’s founder and CEO, explains in a release: “Users are looking for ways to ease, improve and enjoy their day-to-day interaction with their mobile phone, ideally aiming to gain effortless control of the device’s applications and functions, which is where eyeSight’s solution comes to place.” Google must see the same potential: It recently spent an estimated $35 million to $40 million on BumpTop, maker of a 3-D natural user interface that could find its way to Android or Google’s Chrome OS environment.
The eyeSight product reminds me of a conversation I recently had with my 12-year-old son after watching “Bicentennial Man,” a movie about an android that wants to become human. I asked my son if he thought we would have natural-acting robots in his lifetime, and he said no, because consumer technology that lets robots sense the world the way humans do doesn’t yet exist. That launched a long discussion on how device cameras can be used as “eyes” — just as eyeSight is doing — and how microphones can be “ears.” We have robust technology for devices to sense their surroundings, but interpreting what those surroundings mean is, at the consumer device level, still maturing and cost-prohibitive. For now, waving your hand in front of an eyeSight-enabled Android device will have to do.