
Google glasses make sense as the “next” mobile device


Smartphones already have Google Goggles, an image-recognition search app, but consumers may one day have “Google glasses.” The search company publicly introduced Project Glass on Wednesday, a concept for wearable glasses that integrate directly with Google (s goog) services.

Google is sharing this video to kickstart ideas and gather feedback for the connected, wearable device concept. It may sound like a silly project, but when you think about it, the device actually makes sense.

The initial design vision is for lightweight frames that don’t actually have full-sized, traditional glass lenses. Instead, a small display sits up and to the outside of the right eye so that information doesn’t interfere with viewing the world around you. And as shown in the video, there would be some type of eye-tracking mechanism allowing users to interact with data on the small display — similar to a touchpad tap — although much of the interface could be controlled through spoken commands.

As silly as the idea may look or sound to some, I find merit in the approach, as it seems like a logical next step. We have gone from immobile desktops to portable laptops and now we are toting tablets and pocketable smartphones. Where can we go from here if not to the growing number of connected, wearable gadgets that we have been reporting on for well over a year?

From a consumer perspective, Project Glass also forwards another theme that has been growing. Touchable user interfaces have reinvented how we use mobile devices, but hardware design is advancing to the point where the interfaces are starting to disappear. Instead of holding an iPad (s aapl) or other tablet, people are interacting directly with an app, Web page, photo or other digital object in a reduced interface, with either voice or minute gestures. In essence, such glasses would allow people to digitally interact with the physical world around them without a device or user interface getting in the way.

This quote, given to the New York Times by someone who tried the glasses, says it better than I can:

They let technology get out of your way. If I want to take a picture I don’t have to reach into my pocket and take out my phone; I just press a button at the top of the glasses and that’s it.

I expect that any first-production version of these glasses would heavily leverage a smartphone’s connectivity, much like many smart watches do today. The glasses would likely have a low-powered wireless connection to a phone, which would provide Internet connectivity, a place to store photos or a way to shoot them to the cloud, a GPS for location purposes, and so on.

Think of these spectacles as the next iteration of a smartphone, just one that you can wear and not look geeky. Well, not too geeky anyway. I’m all for wearable gadgets, so if Google is looking for beta testers, I’m in. How about you?

20 Responses to “Google glasses make sense as the “next” mobile device”

  1. IMHO, these glasses make the beautiful girl in the photo above look weird.
    I doubt she would wear them daily.

    Google glasses are quite a niche product, suitable ONLY when your hands are occupied with something else,
    e.g. driving home in a traffic jam.
    I doubt many people will adopt glasses as a daily gadget to wear.
    And here is why:

    Glasses are a 13th-century invention, and their construction, although drastically improved, still makes them very uncomfortable to use.

    In other words: wearing glasses is a big trade-off between their benefits on one side and the discomfort of wearing them on the other.

    1) You won’t wear them 100% of your waking hours.
    2) That means you have to store them somewhere = one more case to carry and take care of.
    3) Glasses are far less durable than a solid slab of glass or plastic – they have earpieces.
    4) Glasses never fit your face and nose 100% = you constantly have to adjust their position on your nose.

    Using glasses as a daily communication gadget is a nightmare, because wearing glasses daily is a nightmare,
    especially for those who have never tried it before.
    I don’t think “augmented reality” will make glasses a replacement for smartphones in daily use.

  2. Julien Chabe

    The real question is: would Sergey use that shit?
    If you use it, don’t you think you will just be in reply mode? There is nothing proactive in this service; you just consume and reply to content. That’s what most people will do. I think it is really interesting to create this, but if we treat it the way we treat email (no one really knows how to use email well), then we just create bullshit stuff to make some cash.

  3. batjam

    The negative responses to head-mounted displays remind me of similar attitudes demonizing early adopters wearing earbuds. Any social stigma was eventually seen as less important than the improved portable sound quality and power requirements reduced to 1/1000th of conventional speakers. These otherwise unachievable advantages were the result of the obvious physics involved: the closer the device is to the sense organ, the less power is required. In portable devices the screen is now the single biggest power drain (other than the GPS and cell radios). Moving the display closer to the eye is the only way to reduce power while improving display quality and actually achieving hands-free use. Obviously this would sideline touch interfaces, so don’t give up on the remote cursor interface. Even this can be hands-free, with a cursor steered by eye movement using “blink-to-click” together with voice.

    • Jeremy Bee

      I disagree. I think you are over-simplifying as well as focussing far too much on a bias you seem to have about power consumption, which I don’t see as having much to do with it at all.

      Earbuds caught on and became “acceptable” simply because so many people were using devices that one’s life would be cacophonous if people didn’t wear them. It was the sheer number of mobile media players in use that did it, not power savings. Nothing to do with Physics at all.

      If Google glasses offer some real advantages (doubtful at best), then they will be used, although early adopters will be scorned and ridiculed. At some point, if there are enough people using them, they will suddenly seem “acceptable,” and this will drive a second wave of adoption.

  4. Yingkuan Liu

    Looks pretty ugly even on a blonde hottie.
    Like a band-aid on broken glasses.
    I hope Apple can make it more elegant, but without Steve I’m not sure they can pull it off.

  5. Joe Tierney

    As if people randomly talking on their cell phones isn’t annoying enough. I applaud them for thinking big but I don’t want a computer on my face, in my pocket is close enough.

  6. This video should have been posted on the 1st of April.
    And those photos of smiling people with sh*tty glasses … :D very funny.
    Yes, we all need a plug for Google to show us adverts based on our vision, hahaha.
    What about during sex: adverts for condoms, HIV tests or diapers?
    What a big FAIL….

  7. ~10 years too early. Having data superimposed over our vision (and a camera capable of capturing what we see) is an obvious application and has been a sci-fi staple for ages. The technology is not there yet, however – not display quality, not data transmission speeds, not data processing and storage, not input/interface.

    If this launches soon, it will be to the ultimate, mass-adopted form of the technology what the Nokia 9000 was to the iPhone.

    • Alex Murphy

      I think you vastly underestimate the exponential speed at which technology improves. And 4G is plenty fast enough to deliver all the information, at least what’s shown in the concept video. I’d say 2-3 years away, max. And it’ll probably be in beta until then anyway, knowing Google, haha.

      • I was thinking less about the information displayed and more about the information that needs to be captured and processed (which pretty clearly will not be done on the glasses themselves) in order for the results to be displayed on the glasses in, pretty much, real time.

        We are not close to that yet, and by the time we are, a more elegant solution may well have presented itself.

  8. Imran Jafri

    They didn’t come up with a new idea; I guess everyone watched the TED video. Even in a controlled demo I felt a little dizzy. I’d have a great chance of colliding with people, or just mis-stepping off the 110th floor. As if enough people were not already dying from texting while driving, now they would die just by walking down the road.