Summary:

Gesture control is likely to be integral to the smart home, but how it will be implemented is still up in the air. Ninja Blocks uses EMF detection instead of a camera.


As more companies investigate the myriad ways we will navigate ubiquitous computing in our homes, cars and workplaces, gesture is always one of the top ideas. While it won’t work everywhere, moving your hands to make something happen is a time-honored sci-fi trope and a practical option for many scenarios.

We’ve discussed using gesture in the operating room so surgeons don’t have to touch a keyboard and compromise a sterile environment; car manufacturers are experimenting with gesture-based controls for infotainment systems; and in the home we’re thinking about how to use gesture to control the environment. In most of these cases, understanding the gesture falls to a camera, a touch surface or maybe a microphone, but research into other sensing methods is ongoing. A promising one is decoding gestures from the disruptions a moving hand causes in wireless signals.
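To make that idea concrete, here’s a toy sketch of what decoding a gesture from signal disruption might look like: a hand passing through a wireless link shows up as a dip in received signal strength, and the shape of that dip hints at the motion. The sample source, the thresholds and the gesture labels below are all illustrative assumptions, not any shipping system’s API.

```python
# Hypothetical sketch: guessing a hand gesture from a window of received
# signal-strength (RSSI) samples. The 6 dB threshold and the two gesture
# labels are made-up values for illustration only.
from statistics import mean

def classify_gesture(rssi_window):
    """Guess a gesture from a window of RSSI readings (in dBm)."""
    baseline = mean(rssi_window[:5])  # assume the first few samples are "quiet"
    # Indices where the signal dipped noticeably below the baseline.
    dips = [i for i, r in enumerate(rssi_window) if baseline - r > 6]
    if not dips:
        return "none"
    duration = dips[-1] - dips[0]
    # A brief dip reads as a pass-through; a sustained one as a hover.
    return "swipe" if duration < 10 else "hold"

# Toy reading: a brief 8 dB dip in the middle of an otherwise steady signal.
samples = [-40] * 5 + [-48] * 6 + [-40] * 5
print(classify_gesture(samples))  # -> "swipe"
```

A real system would need far more robust signal processing, but the core loop is the same: watch for perturbations against a baseline and match their shape to a motion.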

I had thought this was still in the research stage, but for close-up applications Microchip already makes an electromagnetic field (EMF) detection component that enables a variation of this today, according to Pete More, a founder of Ninja Blocks, whose futuristic home hub, the NinjaSphere, is about to complete its funding on Kickstarter. We’ll have more from More on next week’s Internet of Things podcast, but in the meantime I was psyched to learn that the company’s hub uses EMF detection to implement gesture controls directly above the device.

More says it’s cheaper than a camera, and because the NinjaSphere has a relatively beefy ARM Cortex-A8 processor inside, the device has the computing power to run the algorithms that translate EMF disruptions into a recognizable gesture. The goal of putting gesture recognition on the device itself was to let people control their lights or other connected devices more quickly than they could by pulling out a phone, opening an app and issuing instructions.
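As a rough illustration of that "gesture instead of phone app" flow, here’s a minimal sketch in which a gesture recognized above the hub maps straight to a device command. The gesture names and the set_lights helper are hypothetical stand-ins, not the NinjaSphere’s actual API.

```python
# Minimal sketch: route a recognized gesture directly to a device action.
# set_lights() is a placeholder for whatever command a hub would send to
# connected bulbs; the gesture names are invented for this example.
def set_lights(on, brightness=1.0):
    state = "on" if on else "off"
    print(f"lights {state} at {brightness:.0%}")

GESTURE_ACTIONS = {
    "swipe_up":   lambda: set_lights(True),
    "swipe_down": lambda: set_lights(False),
    "circle":     lambda: set_lights(True, brightness=0.3),  # e.g., a dim "movie" level
}

def on_gesture(name):
    """Fire the action bound to a gesture, ignoring unknown ones."""
    action = GESTURE_ACTIONS.get(name)
    if action:
        action()

on_gesture("swipe_up")  # -> lights on at 100%
```

The point of the design is latency: a wave over the hub is one step, versus the several steps of unlocking a phone and navigating an app.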

For more on the NinjaSphere and the philosophy behind it, check out the podcast next week. In the meantime, I’m pretty excited to have another alternative to cameras, infrared and microphones for gesture recognition.

