The parade of new gesture technologies continues, the latest being a compelling effort from Elliptic Labs, a six-year-old Norwegian company that uses ultrasound to let your gestures control your handset.
The technology requires a few additional components inside the handset — namely a transceiver that emits a high-pitched sound and up to three or four mics to hear it, said Haakon Bryhni, CTO of Elliptic Labs. That many mics gives Elliptic coverage on all sides of the phone, and isn’t far-fetched: the latest iPhone already has three.
As a person moves their hand within about 2 feet of the phone, the mics can hear the sound waves being disrupted, so the Elliptic software can tell the phone or the app where the hand is and what gesture it’s making. It’s similar to technology that does gesture recognition using Wi-Fi, except Elliptic’s is already commercialized. As for battery life, Bryhni didn’t supply any details, saying only that it’s not a power suck.
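To make the idea concrete, here is a minimal sketch of how this class of ultrasonic sensing can work in principle. This is not Elliptic’s actual algorithm (the company hasn’t published it); it simply illustrates the physics the article describes: a transducer emits a ping, each mic records the echo’s arrival time, distance falls out of the round-trip time-of-flight, and the mic that hears the echo first tells you which side of the phone the hand is on.

```python
# Hypothetical illustration, not Elliptic Labs' implementation:
# locating a hand from ultrasonic echo delays heard at several mics.

SPEED_OF_SOUND = 343.0  # metres per second, in air at ~20 C


def echo_distance(delay_s: float) -> float:
    """Convert a round-trip echo delay into a one-way distance (metres)."""
    return SPEED_OF_SOUND * delay_s / 2


def nearest_mic(delays: dict) -> str:
    """The mic with the shortest echo delay is closest to the hand."""
    return min(delays, key=delays.get)


# Example: the left mic hears the echo 2.5 ms after emission, the
# others slightly later -- the hand is roughly 43 cm away, to the left.
delays = {"left": 0.0025, "right": 0.0031, "top": 0.0029}
print(round(echo_distance(delays["left"]), 3))  # -> 0.429
print(nearest_mic(delays))                      # -> left
```

Tracking how these delays shift over successive pings is what would let software distinguish, say, a left-to-right swipe from a hand moving toward the screen.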
From there it’s the same as any other phone. Applications can use the Elliptic SDK to take full advantage of the gesture technology, or the user can run an Elliptic emulator that offers the same basic functionality as touch-based gestures. Bryhni says three handset makers are planning to offer the technology in their phones next year, and the company is looking for developers to build cool apps that showcase it.
As the latest ads for the Moto X handset, which features always-on voice control, show us, we’re getting less enamored of picking up our handsets or even looking at our screens. Gesture-based UIs are also an intriguing option for interfacing with the array of connected devices being embedded into our homes, cars and clothing. Elliptic’s tech may start out in a handset and end up in a pair of connected glasses; paired with a projector, it might even let our phones project a bubble of compute around us.