Forget touch screens and voice recognition: Thalmic Labs’ UI uses muscle movements


The more places we add computing, the more tired keyboards, mice and even touchscreens look as interfaces, which is why the latest video from Thalmic Labs, showing off its MYO armband, is so cool. The MYO armband uses muscle movements to figure out what gesture the wearer is making, and then relays that back to software. The video shows people playing Tetris by waving their hands and driving a Sphero by waving their arms.

The startup, founded last year and based in Waterloo, Ontario, reminds me of the Leap Motion guys, who are building a gesture interface for PCs and laptops using sensors. Both take relatively “flat” input methods, such as touch or mice, and make them 3D. Now you can move not just left and right, but also above and below a plane.

The MYO armbands will ship at the end of the year to the first 25,000 people who pre-ordered, with another batch coming in 2014. They cost $149, which is pretty compelling if even a few decent apps are available. Games are a good entry point for this type of gesture-based control, with the MYO sold as an accessory of sorts.

But if the company makes its SDK available, I can imagine it as a controller for a variety of things, including the connected home. Imagine wearing one of these armbands and using gestures to control your stereo from across the room, or even locking the doors with a motion as you leave the house. Of course, that assumes a device you’d wear constantly, which then gets us into the thorny UI issue of how to tell the device when you want it to pay attention and take action based on a motion, as opposed to when you are just scratching your nose or waving hello to a neighbor.
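One common answer to that attention problem is an explicit activation gesture: the device ignores everything until it sees a deliberate “wake” motion, then accepts commands only for a short window afterward. Here is a minimal sketch of that state machine in Python; the gesture names, the timeout, and the `GestureController` class are all hypothetical illustrations, not part of any announced MYO API:

```python
import time


class GestureController:
    """Hypothetical controller that ignores gestures until a wake motion.

    After the wake gesture, commands are accepted for a short window,
    so an accidental wave to a neighbor is simply dropped.
    """

    WAKE_GESTURE = "double_tap"  # assumed activation motion
    WINDOW_SECONDS = 5.0         # how long commands stay accepted

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._armed_until = 0.0  # timestamp when the window closes

    def on_gesture(self, gesture):
        now = self._clock()
        if gesture == self.WAKE_GESTURE:
            # The wake gesture opens (or extends) the command window.
            self._armed_until = now + self.WINDOW_SECONDS
            return "armed"
        if now < self._armed_until:
            # Inside the window: treat the gesture as a command.
            return f"execute:{gesture}"
        # Outside the window: nose-scratches and hello-waves are ignored.
        return "ignored"
```

With this pattern, waving at the stereo does nothing unless it follows the wake gesture within a few seconds, which is roughly how voice assistants later solved the same problem with wake words.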

For a taste of the device and possibilities, check out the video below:

