


Smartphones already have Google Goggles, an image-recognition search app, but consumers may one day have “Google glasses.” The search company publicly introduced Project Glass on Wednesday, a concept for wearable glasses that integrate directly with Google services.

Google is sharing this video to kickstart ideas and gather feedback for the connected, wearable device concept. It may sound like a silly project, but when you think about it, the device actually makes sense.

The initial design vision is for lightweight frames without full-sized, traditional glass lenses. Instead, a small display sits up and to the outside of the right eye, so information doesn’t interfere with the wearer’s view of the world around them. And as shown in the video, some type of eye-tracking mechanism would let users interact with data on the small display (similar to a touchpad tap), although much of the interface could be controlled through spoken commands.
 

As silly as the idea may look or sound to some, I find merit in the approach, as it seems like a logical next step. We have gone from immobile desktops to portable laptops and now we are toting tablets and pocketable smartphones. Where can we go from here if not to the growing number of connected, wearable gadgets that we have been reporting on for well over a year?

From a consumer perspective, Project Glass also advances another growing theme. Touch interfaces have reinvented how we use mobile devices, but hardware design is advancing to the point where the interface itself is starting to disappear. Instead of holding an iPad or other tablet, people are interacting directly with an app, Web page, photo or other digital object through voice or minute gestures. In essence, such glasses would let people digitally interact with the physical world around them without a device or user interface getting in the way.

This quote, given to the New York Times by someone who tried the glasses, says it better than I can:

They let technology get out of your way. If I want to take a picture I don’t have to reach into my pocket and take out my phone; I just press a button at the top of the glasses and that’s it.

I expect that any first-production version of these glasses would heavily leverage a smartphone’s connectivity, much like many smart watches do today. The glasses would likely have a low-powered wireless connection to a phone, which would provide Internet connectivity, a place to store photos or a way to shoot them to the cloud, a GPS for location purposes, and so on.

Think of these spectacles as the next iteration of a smartphone, just one that you can wear and not look geeky. Well, not too geeky anyway. I’m all for wearable gadgets, so if Google is looking for beta testers, I’m in. How about you?


  1. They didn’t come up with a new idea; I guess everyone watched the TED video. Even in a controlled demo I felt a little dizzy. I’d have a great chance of colliding with people, or simply mis-stepping off the 110th floor. As if enough people weren’t already dying from texting while driving, now they’d die just by walking down the road.

  2. giulia de ponte conti Wednesday, April 4, 2012

    Reblogged this on G.DpC.

  3. ~10 years too early. Having data superimposed over our vision (and a camera capable of capturing what we see) is an obvious application and has been a sci-fi staple for ages. The technology is not there yet, however: not display quality, not data transmission speeds, not data processing and storage, not input/interface.

    If this launches soon, it will be to the ultimate, mass-adopted form of the technology what the Nokia 9000 was to the iPhone.

    1. I think you vastly underestimate the exponential speed at which technology is advancing. And 4G is plenty fast enough to display all that information, at least what’s shown in the concept video. I’d say 2–3 years away, max. And it’ll probably be in beta until then anyway, knowing Google haha.

      1. I was thinking less about the information displayed, and more about the information that needs to be captured and processed (which pretty clearly will not be done on the glasses themselves) in order for the results to be displayed on the glasses in, pretty much, real time.

        We are not close to there, and by the time we are, a more elegant solution may well present itself.

  4. What? Are they trying to make everyone look like a nerd?

  5. This video should have been posted on the 1st of April.
    And those photos of smiling people with sh*tty glasses … :D very funny
    Yes, we all need a plug for Google to show us adverts based on our vision, hahaha
    What about during sex: adverts for condoms, HIV tests or diapers?
    What a big FAIL….

  6. You can’t hide the obvious. Ugly doesn’t sell.

  7. As if people randomly talking on their cell phones isn’t annoying enough. I applaud them for thinking big, but I don’t want a computer on my face; in my pocket is close enough.

  8. I’ve been beta testing the Motorola Kopin Golden-i since last October: http://armdevices.net/2012/01/14/motorola-kopin-golden-i-at-the-verizon-booth-at-ces-2012/

  9. Looks pretty ugly, even on the hot blonde.
    Like a band-aid on broken glasses.
    Hope Apple can make it more elegant. But without Steven, I’m not sure they can pull it off.

  10. The negative responses to head-mounted displays remind me of similar attitudes demonizing early adopters wearing earbuds. Any social stigma was eventually seen as less important than the improved portable sound quality, with power requirements reduced to a thousandth of conventional speakers’. These otherwise unachievable advantages were the result of the obvious physics involved: the closer the device is to the sense organ, the less power is required. In portable devices today, the screen is the single biggest power drain (other than the GPS and cell radios). Moving the display closer to the eye is the only way to reduce power while improving display quality and actually achieving hands-free use. Obviously this would displace touch interfaces, so don’t give up on the remote cursor interface. Even that can be hands-free, with a cursor steered by eye movement using “blink-to-click” together with voice.

    1. I disagree. I think you are over-simplifying, as well as focusing far too much on a bias you seem to have about power consumption, which I don’t see as having much to do with it at all.

      Earbuds caught on and became “acceptable” simply because so many people were using devices that life would have been cacophonous if people hadn’t worn them. It was the sheer number of mobile media players in use that did it, not power savings. Nothing to do with physics at all.

      If Google glasses offer some real advantages (doubtful at best), then they will be used, although early adopters will be scorned and ridiculed. At some point, if there are enough people using them, they will suddenly seem “acceptable,” and this will drive a second wave of adoption.
