Get Ready to Control Your Android Phone With the Wave of a Hand

Google Android (s goog) handsets with front-facing cameras can now be controlled using simple hand gestures, courtesy of eyeSight Mobile Technologies. The privately held Israeli company uses a phone’s camera to watch for gestures, which its software then interprets as commands such as zooming a map, controlling audio playback or answering a phone call.
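eyeSight hasn’t published how its recognition works, but the general idea behind camera-based gesture control is straightforward: compare successive frames, find where the image is changing, and track how that region of motion moves. The toy sketch below illustrates the idea with a simple swipe detector in plain Java; it uses synthetic frames instead of a real camera feed, and its thresholds and logic are invented for illustration, not taken from eyeSight’s product.

```java
/** Toy swipe detector: tracks the horizontal centroid of frame-to-frame
 *  motion and reports a swipe once it travels far enough in one direction.
 *  Purely illustrative -- a real system must handle noise, lighting, etc. */
public class SwipeDetector {
    private static final int THRESHOLD = 30;  // min luminance change that counts as motion
    private double lastCentroidX = -1;        // -1 means "no motion tracked yet"
    private double travel = 0;                // accumulated horizontal movement
    private int[][] prevFrame;

    /** Feed one grayscale frame (values 0-255); returns "LEFT", "RIGHT", or null. */
    public String onFrame(int[][] frame) {
        String gesture = null;
        if (prevFrame != null) {
            long sumX = 0, count = 0;
            for (int y = 0; y < frame.length; y++)
                for (int x = 0; x < frame[y].length; x++)
                    if (Math.abs(frame[y][x] - prevFrame[y][x]) > THRESHOLD) {
                        sumX += x;
                        count++;
                    }
            if (count > 0) {
                double cx = (double) sumX / count;   // centroid of changed pixels
                if (lastCentroidX >= 0) travel += cx - lastCentroidX;
                lastCentroidX = cx;
                // Enough consistent horizontal motion: call it a swipe.
                if (travel > 8)  { gesture = "RIGHT"; travel = 0; }
                if (travel < -8) { gesture = "LEFT";  travel = 0; }
            } else {
                lastCentroidX = -1;  // motion stopped; reset the track
                travel = 0;
            }
        }
        prevFrame = frame;
        return gesture;
    }

    public static void main(String[] args) {
        SwipeDetector d = new SwipeDetector();
        // Synthetic 16x16 frames: a bright "hand" block sliding left to right.
        for (int step = 0; step < 12; step++) {
            int[][] frame = new int[16][16];
            for (int y = 6; y < 10; y++)
                for (int x = step; x < step + 4 && x < 16; x++)
                    frame[y][x] = 200;
            String g = d.onFrame(frame);
            if (g != null) System.out.println("Swipe detected: " + g);
        }
    }
}
```

In a real app, something like this would sit behind the Android camera preview callback and would need far more robustness against lighting changes, camera shake and background motion.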

eyeSight is providing its solution to Android device manufacturers with the expectation that — like the recently launched Android-powered HTC EVO, which is setting sales records for Sprint (s s) — future handsets will also sport a front-facing camera. As Itay Katz, eyeSight’s founder and CEO, explains in a release: “Users are looking for ways to ease, improve and enjoy their day-to-day interaction with their mobile phone, ideally aiming to gain effortless control of the device’s applications and functions, which is where eyeSight’s solution comes to place.” Google (s goog) must see the same potential: It recently spent an estimated $35-$40 million on BumpTop, maker of a 3-D natural user interface that could find its way to Android or Google’s Chrome OS environment.

The eyeSight product reminds me of a conversation I recently had with my 12-year-old son after watching “Bicentennial Man,” a movie about an android that wants to become human. I asked my son if he thought we would have natural-acting robots in his lifetime, and he said no, because the technology for robots to sense the world the way humans do doesn’t exist at the consumer level. That launched a long discussion on how device cameras can be used as “eyes” — just as eyeSight is doing — and how microphones can be “ears.” We have robust technology for devices to sense their surroundings, but interpreting what those surroundings mean is, at the consumer device level, still immature and cost-prohibitive. For now, waving your hand in front of an eyeSight-enabled Android device will have to do.

Related content on GigaOM Pro (subscription required):

Cool, Calm and Connected: 3 Design Principles for Connected Objects

9 Comments

Kobkrit

When you’re driving, using hand gestures to control the phone is a brilliant idea, but if the information is still displayed on the screen, you still need to look at the phone’s display, which is not safe driving. This innovation must be integrated with an efficient text-to-speech (TTS) system so the driver can avoid looking at the phone at all, which would make driving safer.
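Android has in fact shipped a text-to-speech engine since version 1.6, exposed through the android.speech.tts.TextToSpeech API, so the integration Kobkrit suggests is plausible today. Below is a minimal, hypothetical sketch of announcing an incoming caller aloud; the caller name and the gesture prompt are invented for illustration.

```java
import android.app.Activity;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

/** Hypothetical sketch: announce an incoming caller aloud so the driver
 *  can respond with a gesture without ever looking at the screen. */
public class CallerAnnouncer extends Activity implements TextToSpeech.OnInitListener {
    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this);  // engine initialization is asynchronous
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
            // In a real app the caller name would come from the telephony layer.
            String callerName = "Jane Doe";
            tts.speak("Incoming call from " + callerName
                    + ". Wave right to answer, left to dismiss.",
                    TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) tts.shutdown();  // release the TTS engine
        super.onDestroy();
    }
}
```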

Ray

A few years ago I saw videos of a researcher pushing a virtual ball around on a screen with hand gestures. The experiments were filmed in the 1970s.

The issue here isn’t the technology, which has been around for decades; it’s finding a problem for which the technology is the best among competing solutions. I can see something like Microsoft’s Project Natal taking off, especially in the gaming arena. But having my cell phone recognize gestures? It’s hard to see what it’ll enable me to do that I can’t do with a touch screen now.

Sanjay Maharaj

Very interesting innovation. My question is whether it really has enough value in it, and how reliable it will be. It will have to be 110% foolproof to gain traction and adoption.
