Microsoft’s Xbox Kinect accessory might have the spotlight when it comes to gesture technology, but it’s not the only game in town. EyeSight, an Israel-based firm, was founded back in 2004 and focuses solely on touch-free interfaces for mobile devices. Tomorrow, the company will announce support for mobile computers and tablets running Google Android or Microsoft Windows: using the integrated front-facing camera on a device, one can control and navigate the user interface with the wave of a hand.
According to eyeSight, its Touch Free technology is ideal for activities that until now have been shackled to the keyboard: remotely controlling media playback, turning pages in an e-book, advancing slideshow presentations and similar navigation tasks. The eyeSight solution is clever in that it requires no additional hardware, as integrated cameras are now a staple of netbooks, notebooks, tablets and other handheld devices.
EyeSight’s Touch Free solution leverages existing hardware, using software for real-time image processing and algorithms that interpret gestures. The company claims the technology has a small footprint and a minimal hit to the device processor, which matters for overall performance and battery life: gesture navigation is great, but if it slows down a computer or drains the battery twice as fast, it’s not likely to gain adoption. And for gesture controls to gain traction, we need to see more natural computing user interfaces in the future.
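eyeSight hasn’t published its implementation details, but camera-based wave detection of this kind typically works by differencing successive frames and tracking where the motion is. A minimal sketch in Python, assuming grayscale frames as 2-D lists of pixel values; all function names and thresholds here are illustrative, not eyeSight’s actual API:

```python
# Hedged sketch: detect a horizontal hand-wave from successive camera frames.
# Frames are grayscale images as 2-D lists of 0-255 ints. The names and
# threshold values are hypothetical, chosen only to illustrate the idea.

def motion_centroid(prev, curr, threshold=30):
    """Return the x-coordinate centroid of changed pixels, or None if static."""
    total, weighted_x = 0, 0
    for prow, crow in zip(prev, curr):
        for x, (p, c) in enumerate(zip(prow, crow)):
            if abs(c - p) > threshold:  # this pixel changed enough to count
                total += 1
                weighted_x += x
    return weighted_x / total if total else None

def detect_swipe(frames, min_travel=3):
    """Classify a left/right swipe from the drift of the motion centroid."""
    xs = [motion_centroid(a, b) for a, b in zip(frames, frames[1:])]
    xs = [x for x in xs if x is not None]
    if len(xs) < 2:
        return None  # not enough motion to classify
    travel = xs[-1] - xs[0]
    if travel >= min_travel:
        return "right"
    if travel <= -min_travel:
        return "left"
    return None
```

Because the per-frame work is a single pass over the pixels with no heavy math, a scheme like this can plausibly stay light on the processor, which is the property eyeSight is claiming.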
In terms of that traction, I asked the company about its distribution model because I have both an Android tablet and an addiction to making myself look stupid in front of a gesture-based device like the Kinect. Some Symbian S60 smartphones support Ovi Store games that leverage eyeSight’s technology, but the company doesn’t currently offer its software for direct download to individual consumers. Instead, eyeSight is trying to crack the device manufacturer market by selling technology licenses. These could be used to fully integrate eyeSight into a device and bring gestures into the entire user experience of a notebook, tablet or handset.
Given the current interest in gesture-based gaming and navigation, now might be a good time for eyeSight to build buzz and make its software available directly to consumers. I know I’d wave hello to it on my Android devices to see how well it worked. Would you?