The first iteration of anything is laughable in hindsight. See if you can find a first generation iPhone. Today they look clumsy and odd.

I came across a mid 90s cell phone the other day. It was a brick with an antenna. This year we’ll see phones that look exactly like a watch. Once an idea is manifest in a physical form, evolution begins to accelerate.

You have to start somewhere.

Google Glass 3, on HR Examiner, by John Sumser, March 24, 2014

Google Glass is the gateway to a future of wearable devices and computing embedded in stuff. While that future is particularly hard to map out, you can see it in investment patterns. VCs are turning their attention to hardware and manufacturing.

With Glass, the starting point is 21st Century hardware and contemporary manufacturing. It begins with a device that is significantly less finished than the first generation offerings of earlier eras.

The stuff of everyday life will be things that talk to each other with or without human intervention. They will be built out of parts made with a 3D printer and feature integrated software. No bulging factories, no robots. Just a few humans assembling components from a variety of sources.

Glass is simply the most visible sign of the change.

It’s a delight to experiment with it and to try to see “where the puck is going.” One thing is clear: the next generation of stuff is going to be radically connected. Glass is a first taste of that new world.

Being a Google Glass user creates experiences on multiple levels because it asks me to take on several simultaneous roles. To communicate effectively about the device, each role’s insights need a separate introduction.

  • Product Buyer
    Purchasing my pair of Google Glass frames and getting them assembled and fitted has been an entire story in itself. In order to effectively distribute the device as a part of a pair of glasses, Google must integrate one of the oldest retail industries (optometry) with the newest. Sometimes it makes for good comedy.
  • Product Tester
    In this bucket go the experiences of using the tool and its hardware and software interfaces. Being in permanent beta means taking much of the experience with a grain of salt. This is where ideas for new features, suggestions for fixes, and insights about use cases belong. For instance, the Glass display produces prism effects and distortions in my lenses at night. The device turns itself on in response to head movement (an adjustable setting), and the result is a surprise bright light in your eyes. That’s not always safe at night. Oh, and I really want to be able to zoom into the pictures.
  • Product Advocate/Marketer
    Even the version I use (which is much lower profile than the pure glass device – it’s just a part of my glasses) draws attention when I’m out in public. Surprising numbers of people have no idea what the thing is. Still others have stories and dreams to share. Most everyone who asks wants to know what it’s like and if they can see it up close.
  • Product User
    This is different from testing. It’s the set of answers to the question, “What’s it like to wear the device, and what do you learn?” Glass is appropriate in some settings and not in others.
  • HRTech Researcher
    The first four elements are the learning curve you have to crawl up in order to ask and answer questions like: Is this thing useful in HR? What does it tell you about the future of HRTech? What is the workplace of the future like? Are there any uses for wearable tech in HR?

In other words, being a Google Glass user provides an amazing flow of experiences that jumble up on each other. Going to the optometrist (a great customer service story) flows into Google’s retail and support process (not so good). A trip into the bathroom at the theater produces oddness that’s hard to describe (they’re prescription glasses so there’s no choice in wearing them or not). Personal relationships change when recording capacity is introduced obtrusively.

There’s a big difference between having all of the information in the universe in a small device in your pocket and having some of it right in front of you all the time. Proximity to information makes it possible to make decisions in light of having the information. In your pocket, decisions are often made in spite of having access to the material.

One of the things Glass does really well is listen to you speak and then give directions. The voice-to-text translation is nearly flawless. The maps in the directions are works of elegant simplicity. Every step of the way, Glass gives an estimate of trip duration. It’s accurate and includes the impact of accidents and other traffic interruptions.

That means you can tell when something is wrong before you leave. The other night, the device said that the trip from Berkeley to home would take two and a half hours (versus the usual late night 90 minutes). That was a clue I should have taken. By the time I was stuck in the nasty traffic jam, I’d spent the drive trying to figure out why the time estimate wasn’t getting shorter. In future iterations, the device will tell me why the best route is long and roundabout.

That’s the sort of predictive information we’re going to start having at our fingertips. It’s not some crystal ball gazing wizardry. Instead, it’s “What happens next?” when you know what happens.

As you can see, getting squared away in the process of understanding Glass takes some doing. In the next installment, I’ll take a look at some of the little details and how they work.
