Ubiquitous, gesture-controlled interfaces are one step closer to reality, thanks to a new system developed at Carnegie Mellon University. WorldKit lets you create interactive apps on any surface just by waving your hand. The project was announced by the university on Thursday.
Instead of being tethered to your hardware, WorldKit is designed to make access to computing instant and mobile by making the world your touchscreen. Right now, the system involves a ceiling-mounted camera and projector that record hand movements and then project onto the surface of your choice. Some potential uses include TV remote controls, which can be accessed by rubbing the arm of a sofa, or calendars that can be swiped onto doors.
With projectors and depth-sensing cameras (the current system uses a Kinect) getting smaller, the researchers envision that a system like WorldKit could eventually fit into a light bulb. Any room thus equipped could become a smart environment, where objects and walls become display surfaces. One member of the research team, Chris Harrison, previously worked on the Skinput device, which allows users to turn their own arms into touch interfaces.
In the future, users should be able to design their own interfaces with WorldKit. The system currently allows for things like buttons, multitouch drawing (akin to a whiteboard), and counting the number of objects within an interaction “bubble.” The existing prototype still has limited resolution and input dimensions, but hardware advances and future research could allow voice commands or even interaction in free space rather than on surfaces. The CMU team will be presenting their work at CHI 2013 on April 30.
Image via Chris Harrison/Carnegie Mellon University