MIT's Thin LCD Lets Your Fingers Do the Talking

The geeks at MIT’s Media Lab are at it again, today showing off a thin LCD screen that can respond to both touch and gestures. They call it a bidirectional screen, or BiDi screen for short. I’m not in love with the name, but adding gesture control to thin mobile phones, tablets or netbooks with LCD screens could make those devices considerably more useful.

Large-screen touch interfaces like those found in the HP TouchSmart and the Microsoft Surface table are closer to gestural displays because they rely on cameras embedded in the corners of the screens to sense where your fingers are. However, using that type of camera-based setup on smaller, thinner devices would make the screens too bulky. Instead, the MIT folks took advantage of LCD screens equipped with optical sensors, which should hit the market soon.

The Media Lab folks created an algorithm that lets those sensors mimic a pinhole camera, capturing black-and-white images to track movement on the other side of the screen. They then wrote code that interprets those movements, so a wave of your fingers can manipulate an object on the screen.
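MIT hasn’t published the code behind the demo, but the general recipe it describes (capture a coarse image from the sensor layer, locate the hand, and turn its frame-to-frame motion into an on-screen action) can be sketched in a few lines of Python. The sketch below is purely illustrative; the frame format, the brightness threshold and the function names are my own assumptions, not anything from MIT’s implementation.

```python
import numpy as np

def track_hand(frame, threshold=0.6):
    """Find the centroid of the brightest blob in a grayscale sensor frame.

    `frame` is assumed to be a 2-D NumPy array with values in [0, 1],
    the kind of low-resolution image a pinhole-style sensor layer
    might produce. Returns (x, y) or None if no hand is detected.
    """
    mask = frame > threshold              # pixels likely covered by a hand
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()           # centroid of the bright region

def move_object(obj_pos, hand_now, hand_before, gain=1.0):
    """Nudge an on-screen object by the hand's motion between two frames."""
    if hand_now is None or hand_before is None:
        return obj_pos
    dx = (hand_now[0] - hand_before[0]) * gain
    dy = (hand_now[1] - hand_before[1]) * gain
    return (obj_pos[0] + dx, obj_pos[1] + dy)
```

In a real loop you would call track_hand on every sensor frame and feed consecutive results into move_object, which is roughly how a hand-wave drag or pan could be wired up.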

However, any device that hopes to incorporate gesture control will have to figure out when to turn the gesture-sensing components on, much like programmers had to develop ways for a touchscreen to differentiate between touches that mean “do something” and those that are just meaningless fumbles for the phone. I think we should be able to enable gesture recognition via a tap followed by a friendly wave at the screen. After all, our gadgets are becoming closer and closer to personal assistants — we might as well give them a friendly greeting.
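To make the tap-then-wave idea concrete, here is a toy state machine for arming gesture recognition. Everything about it is hypothetical: the event names, the three-second window and the states are just one way such a handoff could work, not anything MIT has proposed.

```python
import time

class GestureArmer:
    """Toy sketch: a deliberate tap arms the sensor, and a wave shortly
    afterward switches gesture recognition on; anything else is treated
    as a fumble and ignored."""

    def __init__(self, window=3.0):
        self.window = window          # seconds allowed between tap and wave
        self.state = "off"
        self.tapped_at = 0.0

    def on_event(self, event):
        """Feed input events ('tap', 'wave', or anything else); returns the new state."""
        now = time.monotonic()
        if event == "tap":
            self.state, self.tapped_at = "armed", now     # tap arms gesture mode
        elif self.state == "armed" and event == "wave":
            if now - self.tapped_at <= self.window:
                self.state = "on"                         # friendly wave: start tracking gestures
            else:
                self.state = "off"                        # wave came too late
        elif self.state == "armed":
            self.state = "off"                            # any other input cancels the arm
        return self.state
```

A quick tap followed by a wave walks this from off to armed to on, while stray bumps against the screen leave it switched off.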

Image courtesy of MIT
