The nightmare future of Google Glass in the workplace


This is a new series in which I take tech news and extrapolate it into a horrid future nightmare (that just might come true).

It might be because I’m a paranoid pessimist, but I have a hard time seeing the positives in many of the world’s technological advancements. Somehow, I’m always drawn to the most terrible, jarring result of whatever wonderful new gadget hits the market. Now, to get rid of this angst and not have to bother my colleagues with grim visions of the future, I’ll be channeling it straight to you, our readers, like a sad spirit barfing depressing ectoplasm out of this medium’s communication hole.

This week, Google released its new and improved Glass Enterprise Edition. The technology – but more importantly, human acceptance – has matured enough for people to use it around things like heavy machinery and helpless sick people.

In a lengthy Wired story chronicling the comeback of Glass, the revamped AR glasses are praised for bringing “huge gains in productivity” and transforming cumbersome tasks. “Factories and warehouses will be Glass’ path to redemption,” writes Wired’s Steven Levy.

Of course it’s great that an assembly line worker can get complex instructions on what part goes where simply by looking up. But on the other hand, did we completely forget why people were uncomfortable with Google Glass when it first came out?

Why would this be any different in the workplace?

Having a factory full of people getting step-by-step instructions on how to assemble a product means you’re collecting data on how quickly each of those individuals performs each of those steps. It also means it’s easier to decide on an optimal time – a target for each screw driven or bolt tightened.

Remember those stories about Amazon warehouses filled with people being ordered around by a handheld computer that times and tracks everything they do? Remember the pity felt for the inhumanness of it all?

The new Google Glass is so much more. It’s not only a handheld computer; it’s a camera, microphone, and speaker, too! An overseer – automated or human – doesn’t have to look over your shoulder; he (or she) can see through your eyes. But “huge gains in productivity,” right?

Taking an even darker turn, because that’s what this doomsday column is for, why not centralize the overseer role somewhere cheap? Instead of a manager hovering over your shoulder, there could be a kind of helpdesk that monitors lags in productivity and prods workers who are falling behind, checking their phones, or simply looking away from the task at hand.

Another possible scenario would be that Google Glass allows for the hiring of less skilled workers, since everything they need to know is right above their right eyeball. Or as Levy put it, “The company is particularly excited about how Glass helps with training—cutting the time from 10 days to only 3.”

It’s funny, actually: Following persistent step-by-step instructions basically turns us into more flexible robots with better dexterity and greater susceptibility to peer pressure. Also, very replaceable ones.

Now I don’t want to be all grim … Wait, no, fuck that, there is no place for upsides in this piece of text. I’m not leaving you with a happy ending.

Google Glass’ introduction to the workplace is the first step into a world far scarier than the one portrayed by the automation of jobs. It’s the world of the automation of productivity, in which algorithms determine that we can type quicker, walk faster, lift more, and think less.

“OK Glass, proceed.”

The Next Web
