
Eye of the robot: Google working on Android-powered glasses?


Google (s GOOG) may be getting ready to take augmented reality to the next level: According to a report, an Android-powered pair of glasses will go on sale by the end of 2012.

If so, get ready for a new take on the concept of wearable computing. A number of smart gadgets that track your movements and assess your health have made waves over the last year, but Google Glasses (just my suggested name) would be something unique: a network-connected pair of spectacles with a camera and a display that could record images and video of one’s surroundings and display information from elsewhere on a tiny screen, according to a report in The New York Times.

If it sounds like a potential privacy minefield, that’s because it is. There are obviously all kinds of benign applications for such a device, such as tourism or spectator sports, and it’s already easy enough for people to walk around recording each other with smartphones. The first production versions of these glasses will probably make it painfully obvious, however, that you’re wearing a computer over your eyes (and supplant the glasses designed for the armed forces as the world’s most effective birth-control device).

The Times cites an earlier report that said the glasses would be operated by head movements and might resemble a pair of Oakleys. They are likely to feature some of the technology behind the Google Goggles app, and at the price of a modern smartphone (between $250 and $600, according to the report), they might have a pretty limited audience at first.

8 Responses to “Eye of the robot: Google working on Android-powered glasses?”

  1. They should combine them with IR/Low-Light cameras to overlay on the “display” … and the sonar/IR sensors that a Japanese researcher put around the back of his head, so that he could detect movement behind him (most notably, detecting things that might bump into him; which gave him a vibrational buzz when they detected things). Add in a picture-in-picture for a rear mounted camera or two … and you’ve got a pretty impressive all-around sensor. Plus ear pieces connected to microphones (that can also be used as part of your wired or bluetooth headset to your phone).

    Think of it like that (Lobot?) guy that was Lando’s assistant at the Cloud City … only adding a heads-up glasses display to the front. Make it a little more cyberpunk-ish (flat black wrap-around band all the way around your head, seamlessly blending into the glasses over your eyes?). Perhaps have it come in other colors, as well.

    Maybe use LCD tech to make variable opacity/shading of the glasses (like variable sunglasses). And noise filtering on the ear pieces … you could have a pretty impressive product there.

    I’d probably buy that. Though, I’d probably prefer iOS over Android.

  2. madcream10

    Don’t be so naive:
    “(between $250 to $600, according to the report), it might have a pretty limited audience at first.”

    Have you considered how much people spend to watch movies on 4-inch screens?

    This would be a game-changer exceeding the impact of iPad/tablet. Couple that with instant feeds and you have Borg v0.2. Synchronize contextual detection with retrieval of feed archives and you got Borg v0.3. Add brain wave i/o (IBM says 5 yrs) and then you got Borg v1.0. It will be an awesome future.

  3. This seems like an effort by Android to try and place themselves a step ahead of everyone else. Glasses are an interesting concept. Do we think that society is ready for such a thing? Perhaps this is a gadget that should slow down until it is perfected. While it is certainly true that mobile devices are becoming popular, this might be jumping too far ahead. Thoughts?

  4. It’s based on the same idea as the Kopin Golden-i, currently being mass manufactured by Motorola Solutions (the part of Motorola Google isn’t buying) for industrial use (people working in industry who need head-mounted computing, preferably without carrying around a tablet or laptop). I actually showed it to Sergey Brin and Steve Lee and other Google X people at CES. Here’s a 27-minute demo of how it works: and a bit about how I try to use it for augmented video-blogging:

    The Google one, I think, is going to be more compact (just about foldable, to be put in one’s pocket) and run Android instead of Windows CE (industry needs Windows CE), thus having all the augmented reality apps already available on the Android platform, such as Google+, the Ustream uploader, Google Goggles, etc. Getting this to work is mostly about software; the hardware is “just” a microdisplay, from Kopin for example, at least 800×600 resolution, maybe 1024×768, up close to one of your eyes.

    It actually doesn’t have to be glasses; making it sunglasses is just a better way to hide the hardware. Without glasses it can be made as small as a couple of Bluetooth headsets, extractable from one of your ears when you want to use it. It’s basically like putting your phone up close to your eye, mostly with the same processing and sensors as on your phone, and with this you are hands free, so you can more easily walk around, hold other things, do other stuff. Consider it like a dashboard to your life, not so much something you constantly look through; more like something you can glance down at when you need augmented info, otherwise just look with the other eye or straight ahead for a view without anything augmented.