Keen not to be left behind by Apple’s increasing repertoire of multi-touch interface control gestures, Microsoft recently previewed a new technology called “SideSight.” SideSight is not just Redmond’s version of Apple’s tech, though. In fact, Microsoft’s new offering is not touch tech at all. More like proximity tech.
Gearlog provides an overview of SideSight, based on a paper presented by Microsoft U.K. at the User Interface Software and Technology conference. The paper frames the new input tech as a response to the limits of touchscreen interaction, which it argues is ill-suited to small devices with correspondingly small screens. It’s a good point. Even on the iPhone, my meaty digits occasionally obscure some important piece of information.
How does SideSight, ahem, sidestep the problem? By allowing users to interact not only with the device directly, but also with the surrounding space. Outward-facing optical sensors lining the device detect movements a user makes on a surface beneath it or in the air around it, and translate them into control actions. Gearlog provided these examples of how this might work in practice:
Pages could be panned and scrolled by moving a hand up and down, and Microsoft also proved that text could be entered and edited on the main screen through a stylus while the other hand scrolled the page — a movement that would be akin to the motions a user’s hands would make if he or she were writing on a sheet of paper.
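To make the panning example a little more concrete, here is a minimal, entirely hypothetical sketch of how side-sensor readings might be turned into scroll gestures. It assumes (the paper doesn’t specify this) that each frame of input is a list of per-sensor reflectance values along one edge of the device; a finger’s position is estimated as the weighted centroid of active sensors, and its movement between frames becomes a scroll delta.

```python
# Hypothetical sketch of SideSight-style gesture inference. Assumes each
# "frame" is a list of reflectance readings from proximity sensors lining
# one edge of the device; these names and thresholds are illustrative,
# not from Microsoft's paper.

def centroid(frame, threshold=0.5):
    """Return the weighted position of sensors that see a finger, or None."""
    active = [(i, v) for i, v in enumerate(frame) if v >= threshold]
    if not active:
        return None
    total = sum(v for _, v in active)
    return sum(i * v for i, v in active) / total

def scroll_deltas(frames, threshold=0.5):
    """Translate successive sensor frames into scroll steps (+/- direction)."""
    deltas, prev = [], None
    for frame in frames:
        pos = centroid(frame, threshold)
        if pos is not None and prev is not None:
            deltas.append(pos - prev)
        prev = pos
    return deltas
```

A finger sliding steadily along the edge from sensor 1 to sensor 3 would produce a run of positive deltas, which the UI could map to page scrolling.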
So should Apple be wary of Microsoft’s latest foray into hand-waving? A lot will depend on third-party support, and on integration with Redmond’s own future products. While cell phones are clearly a target market for the tech, the report also cites PMPs and watches as candidates. I can see the appeal of SideSight in things like eBook readers, but I have a hard time picturing much consumer interest in watches with gesture control. What do you need to do with your watch that would require you to flail your hands around like a magician about to pull a rabbit out of a hat? And does Apple even care about those markets? Probably not: Steve Jobs doesn’t seem interested even in the netbook market, which is much closer to Apple’s core business.
That still leaves the possibility of home computing, especially as tech like Surface moves downstream into consumer markets and software support arrives on the OS side with Windows 7 and upcoming versions of Windows Mobile. That means a lot of orchestrating between now and then, and as it stands, it’s still not clear how intuitive SideSight is or can become. Multi-touch works because people don’t have to think about learning it. Mastering a complex series of gestures performed in mid-air is a different story.
In the end, as with most of Microsoft’s innovations, the payoff won’t come until long after the announcement. Even then, it will probably be disappointing. Therein lies the major difference between the two companies’ approaches to innovation: Apple keeps things hush-hush and then wows you with little warning (though we try to spoil the surprise), while Microsoft tips its hand and underdelivers. In its infancy, SideSight is a fairly interesting interface technology, and worth talking about, but just think of all the babies Apple isn’t showing off.