So what’s it really like to use Project Glass? Take a look

Facebook Home(s fb) may have stolen the Android(s goog) show of late, but Google's Project Glass hasn't lost its luster. In a video demonstration from last month's SXSW event, Google engineer Timothy Jordan spent nearly an hour showing off the Project Glass hardware, discussing Google's Mirror API for Glass and, perhaps most interesting, walking through the user interface. Here's the video; jump to the 12-minute mark if you want to see the UI bits:

Although I've seen short demonstrations of Glass before, this one is the most detailed and comprehensive I've found yet. Jordan's Glass is connected to a projector in this case, so the audience can see what he sees.

I knew that Google Now had a heavy influence on the Project Glass experience, and it's easy to see why in this demo: Google Now delivers information sized properly for the small screen while offering immediate, tangible benefits.

The demo also illustrates how to interact with Glass using the side panel and head gestures. Tapping brings up the Home screen, while sliding down on the small touchpad works like the Back button in Android. Voice activation is, of course, heavily used, as are audio responses from Glass itself. Impressively, there's no speaker in your ear to block out ambient sound. That's useful for the New York Times app, for example, which can read news aloud.

Jordan spends quite a bit of time discussing the Timeline cards that are supported in Glass; these are the screens of data users can see and interact with. While I'm not a developer, I found the presentation fascinating from a UI perspective, mainly because the Glass screen is limited in size and user interaction on wearable gadgets is so challenging.
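For a sense of how simple the developer side is, here's a minimal sketch of what a Timeline card looks like to an app. It assumes the Mirror API's documented JSON shape for timeline items (a `text` field, a `notification` level, and built-in menu actions like `READ_ALOUD`); treat the exact field names as an illustration rather than a complete reference. The actual card would be sent as an authenticated POST to the Mirror API's timeline endpoint.

```python
import json

def make_timeline_card(text, speak=False):
    """Build the JSON body for a Mirror API timeline card (illustrative).

    A real app would POST this to the Mirror API's timeline endpoint
    with OAuth credentials; that plumbing is omitted here.
    """
    card = {
        "text": text,  # the text shown on the card's screen
        "notification": {"level": "DEFAULT"},  # chime when the card arrives
    }
    if speak:
        # The READ_ALOUD menu action lets the wearer have the card spoken,
        # the way the New York Times Glass app reads headlines aloud.
        card["menuItems"] = [{"action": "READ_ALOUD"}]
    return json.dumps(card)

payload = make_timeline_card("Hello from Glass", speak=True)
print(payload)
```

That's essentially the whole unit of UI on Glass: a card's worth of text, a notification behavior, and a short menu of actions.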