Why the snap of a photo changed my mind about Google Glass

As a serious smartphone addict who jumps at the sound of an iPhone buzz, I know that I really don’t need more internet in my life. So I haven’t had much interest in Google Glass so far, assuming it would just serve to put more apps in front of my face that my current attention span doesn’t need, and that my iPhone could easily handle.

But this morning when I was walking through downtown San Francisco before Google’s I/O conference, I was crossing a street when I saw a particularly pretty scene of the sun rising between two buildings. Hoping I’d have enough time before the walk signal ended, I dug my iPhone out of my bag, swiped to open the camera, snapped a photo, and then jogged to the curb to avoid getting hit by cars. (Mom, I hope you’re not reading this.)

So a few hours later, when I tried on Google Glass for the first time and said the command “take photo,” instantly capturing a photo of my colleague Kevin Tofel standing in front of me without moving either my head or my hands, I started to see the appeal of Glass.

I’d read a decent amount about the technology since Sergey Brin dropped from a helicopter at last year’s Google I/O, and not only was I sort of confused by the specifics of how Glass works (A camera on your face? Facebook on top of everyday life? How do people see when they’re wearing them?), I was turned off by the severely dorky appearance and the idea of constantly monitoring the things around you. They seemed vaguely creepy and intrusive. I was not attracted to the idea of wearing them as a normal person walking around town.

But even though I got only a short spin with the technology on Wednesday, it took just a few seconds for me to understand why people are so jazzed about Glass.

I put them on my face and was immediately impressed with how lightweight they felt. Despite their futuristic, clunky-on-one-side appearance, they didn’t feel very bulky or heavy on my face, and it was easy to see the room around me while wearing them. (Even though they weren’t fitted specifically for my face the way they would be if I purchased them.) The screen felt much smaller and less obtrusive than I’d imagined, and it wasn’t hard to swipe the side of the glasses to navigate the screen. But it was the voice commands, and the “take photo” command, that changed my perspective on the technology.

Would I spend $1,500 on them right now? Definitely not. If you need prescription glasses of any kind, it would be hard to combine those with Glass. While Google has launched them in some jazzy new colors, you still look absurd wearing them (whether you’re in the shower or not). This probably makes me somewhat vain, but I’d want them to look cooler and less futuristic before I wore them in everyday life (seriously, embed them in some Warby Parker frames, and I’d be way more down with the idea.)

And once apps start streaming into the glasses, I can’t imagine how seeing New York Times headlines and tweets wouldn’t be distracting while you’re doing things like walking or driving. Of course, none of this even gets into the new etiquette that would have to arise from the spread of Glass.

But despite all the drawbacks, speaking the words for the “take photo” command made me realize that even if wearable computing has a pretty dorky image right now, the potential practical applications for real-life people who don’t consider themselves nerds are endless — once the technology gets a little more refined, and we figure out how to use them in public.

I talked to one Google employee who said she sat in her sister’s graduation and streamed video through Glass to family members from afar, and another who said she uses it to take photos of her little kids when her hands are full. I would imagine it could be huge for people with disabilities, or people doing outdoor sports (Kevin mentioned you could take photos of mile markers while running a marathon.)

“Every time we’ve tried to do something crazy we’ve made progress,” Larry Page said on stage today. So does Google Glass seem a little nuts right now? Sure. But if a few years from now I can snap a photo of a sunrise without having a near-miss with traffic, I’m open to the possibilities.

10 Responses to “Why the snap of a photo changed my mind about Google Glass”

  1. Alexander Dzhulay

    I wonder where you can buy this gadget? How much does it cost? Would it be possible to test it in field conditions, in mountainous, wooded terrain? How good is it? Email me your opinion. What do you think? [email protected]
  2. Guest

    How about some creep taking pictures of you in the street/subway? How about someone walking up to you on the street/subway and using his cellphone to take pictures of you from every angle? Sounds creepy right? How about someone following you with a video camera? I saw a guy with one of these. I walked up to him, all 6’3″, 240 lb of me and took out my s3 and started taking pictures of him from this side and that. He had the nerve to start objecting. I told him what he could do with his mouth and my rear end.

    How about someone taking a picture of your credit card as you pay? I could go on, but the best way to treat glassholes is to take the matter into your own hands.

    • I am sure that people have tried and succeeded in taking a picture of a person’s confidential information countless times before. An iPhone, or any phone with an 8-megapixel camera attached to it, is capable of snapping a shot of someone’s credit card from at least 6 feet away… Any fault will lie with the person too oblivious to keep some sense of alertness when around other people, especially around an unfamiliar device.

      Also, it does not hurt to ask, “Hello, I noticed those glasses do not look like regular glasses. What are they?”

      Anyway, this is the future and people should try to embrace it. If not then they should just ignore it and live life.

    • Mcbeese

      I had a similar experience. Was out to dinner with a friend and there was a glasshole at the next table. I tried an experiment. I picked up my phone and pointed it at the person wearing Glass. For the first 5-10 seconds, the glasshole felt flattered to be the center of attention. After 30 seconds of me holding my phone up, glasshole looked embarrassed and said “don’t worry, I’m not taking any pictures or video.” I said, “don’t worry, I’m not either.” After about 60 seconds, glasshole took the Glass off, and I put my phone down. Message sent and successfully received.

      The social aspects of Glass are far from understood, let alone worked out.

      • “The social aspects of Glass are far from understood, let alone worked out.”

        Yet you already have a pejorative term for its users and are actively and aggressively confronting, sorry, ‘experimenting’ on, them. Sounds like you’ve worked it out to me: you hate them and you’re going to let them know in no uncertain terms. Their having a camera on their head is, to you, the same thing as you pointing a camera at them in a public space.

        Here’s how ACTUAL society, not your aggressive jerky version, works. You tell the restaurant “I don’t like this”. If they agree, your ‘glasshole’ needs to take his headset off. If they don’t agree, you need to find somewhere else to eat or act like a grown up, not a petulant child.

  3. xtoddrick

    And therein lies one of my issues with Google Glass, Siri and the like: more talking out loud, when keyboards and buttons are (mostly) quiet.