
Summary:

PointGrab is opening up its new SDK at MWC this week, letting developers bring motion control to their iOS apps using an iPhone or iPad’s camera.


This story was corrected on Monday to note that Gad is PointGrab’s VP of marketing, not its CEO, and that the company has worked with Acer, not Asus.

You can use remote gestures to control your laptop, your video game console and your TV. So why not the device you use more than any other?

In a similar fashion to how Microsoft’s Kinect brought gesture controls to the Xbox, PointGrab, using a different technology, is trying to bring hands-free, gesture-based control to iOS games and apps. Gestures are easy to learn and perhaps even more natural than learned input mechanisms like tap to zoom or pinch to enlarge, PointGrab VP of marketing Assaf Gad argues.

“The same way we as people interact with each other, we can allow devices to understand our body language,” he told me via Skype from PointGrab’s headquarters in Hod Hasharon, Israel last week.

For example, he said, all children know the finger-to-lips “shush” gesture that means “quiet down.” So, rather than hunting for a button or volume menu to lower the volume, why can’t our devices know that action means we want the sound muted?

Motion control can be natural and fun — hands-free Angry Birds, anyone? — but occasionally necessary too: devices that understand remote gestures would let you, for example, scroll down the recipe you have open on your iPad while your hands are still wet or covered in raw egg.

How it works

PointGrab doesn’t require additional hardware to make this happen — it uses the device’s own camera. PointGrab’s technology uses motion-detecting algorithms to identify the X and Y coordinates of a person’s hand through the camera. Gad says the software can detect a single finger up to 17 feet away. Here’s a video Laptop Mag shot at CES 2013 of a free app PointGrab created simply to demonstrate its technology on an iPhone or iPad.
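The general idea behind camera-based hand tracking can be sketched with simple frame differencing: compare consecutive camera frames, find the pixels that changed, and treat their centroid as the hand’s X/Y position. To be clear, this is a toy illustration of the concept; PointGrab’s actual algorithms are proprietary and far more sophisticated (they can pick out a single finger, not just a moving blob).

```python
import numpy as np

def locate_motion(prev_frame, curr_frame, threshold=30):
    """Return the (x, y) centroid of pixels that changed between two
    grayscale frames, or None if nothing moved enough."""
    # Absolute per-pixel difference between consecutive frames.
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.nonzero(diff > threshold)  # pixels that changed
    if xs.size == 0:
        return None
    # The centroid of the moving region stands in for the hand position.
    return float(xs.mean()), float(ys.mean())

# Simulate a 100x100 camera frame where a "hand" enters from the right.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 70:90] = 255            # bright blob appears
print(locate_motion(prev, curr))    # centroid near (79.5, 49.5)
```

An app polling this function every frame would get a stream of coordinates it could map to cursor movement or gestures.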

You can also download that app, called CamMe, from the iOS App Store yourself. I tried it out, and it works as promised, although it’s occasionally a bit slow. If you want a self-portrait or group photo but don’t have Inspector Gadget arms, set your device down, raise your hand from a couple of feet away to turn on the camera, and close your fist when you’re ready. The app counts down three seconds before snapping the photo. The downside is you’re using the front-facing camera, which isn’t as good as the iPhone’s rear camera.
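The raise-hand-then-close-fist flow above amounts to a tiny state machine. Here’s a minimal sketch of that logic; the gesture event names ("open_hand", "fist") are illustrative assumptions, not PointGrab’s actual SDK API:

```python
# Toy state machine for a CamMe-style shutter flow.
COUNTDOWN_TICKS = 3

def run_shutter(events):
    """Walk a sequence of gesture events; return the actions taken."""
    state, log = "idle", []
    for event in events:
        if state == "idle" and event == "open_hand":
            state = "armed"            # a raised hand arms the camera
            log.append("camera armed")
        elif state == "armed" and event == "fist":
            # A closed fist starts the countdown, then takes the photo.
            log.extend(f"countdown {n}" for n in range(COUNTDOWN_TICKS, 0, -1))
            log.append("photo taken")
            state = "idle"
    return log

print(run_shutter(["open_hand", "fist"]))
# ['camera armed', 'countdown 3', 'countdown 2', 'countdown 1', 'photo taken']
```

Note that a fist with no prior open hand does nothing, which matches the app’s behavior of requiring you to arm the camera first.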

Gad emphasizes that this isn’t supposed to replace a touchscreen altogether but is for use in certain cases: “We’re extending a user interface for specific scenarios in mobile.”

A gesture-controlled future

But there’s growing momentum in this space. In the living room, it’s already moved from futuristic fantasy to Thing Normal People Use, thanks to Microsoft’s success with the Kinect. And Samsung, the world’s largest TV maker, wants to replace your remote with your hand when controlling its new smart TVs.

But the most high-profile up-and-comer now trying to make gesture-based computing happen for a broader user segment is Leap Motion, whose mission is to bring 3D motion control to computers. (Its website contains the rather cheeky phrase, “Typing? Seriously? That’s fine for writing a novel.”) It recently got $30 million in venture funding to try to pull this off.

PointGrab, meanwhile, has been around since 2008. Its gesture-control software has been in Fujitsu laptops since 2010, and the company has worked with Acer and Samsung on laptops too. But PointGrab is looking past laptops now; it wisely sees the future is more mobile and is attempting to get developers interested in using the technology in their iOS apps.

The thing Leap Motion has going for it that PointGrab does not yet is active third-party developer interest: 12,000 developers are working with Leap Motion’s SDK. But PointGrab hopes to change that soon. This week at Mobile World Congress in Barcelona, PointGrab is opening up its SDK for the first time to anyone who wants to give it a whirl.

Gad is particularly hoping for forward-thinking developers who aren’t satisfied with only touch- or voice-based control of the iPhone or iPad. “I think existing applications always try to find another way for interaction or another way to suggest new features to the user. I believe our technology will add another capability.”

PointGrab’s SDK will be free at first for those who have “good ideas” for how to apply the software to their apps.
