Google adds remote controls to Google Glass Android app


Google Glass just got a little easier to use, assuming that you can get your hands on Google’s wearable. The product, which is limited to developers and beta testers for now, uses a companion app called MyGlass to configure the headset. A new update to MyGlass adds remote control and input options from a connected Android phone.

Google didn’t share many details of the update, which arrived in the Google Play store on Thursday, saying only this about the addition:

“Remote control/input for screencast. Touch/swipe/tap to control the Glass UI through the screencast experience.”

By “screencast”, Google means showing the Google Glass user interface on a connected phone. That lets others see what you see through Glass and helps to demo the device’s features.


After the MyGlass update, however, the connection becomes more of a two-way pipeline. Swiping on the phone screen will navigate the Glass UI just as swiping the touchpad on the side of Glass does. I could see this being useful for text input as well. That's not all, though.

Here’s a thought from Google Glass Explorer DeWayne Lehman:

“This could dramatically change how Google Glass is used. In fact, imagine this. You side load an app into Glass, like Ingress. You use Glass for all viewing and only pull your phone out to play and perform actions while most of the time you enjoy a gyroscopically controlled map overlay in Glass.

Imagine controlling devices like a quadcopter on your phone while controlling a camera mounted on it by moving your head. You could literally drive a car with this naturally.”

According to Lehman, Glass itself likely needs a software update to support the new phone app, because the remote control features aren’t working just yet. Google updates the Glass software every month, however, so the wait for the new interface option shouldn’t be long.

