Thoughts | BBH Stockholm

iOS 11 got us thinking about AR. Again.

Written by Andreas Jakobsson | Sep 25, 2017 11:45:38 AM

The release of iOS 11 and the new iPhone X got us, once again*, thinking about augmented reality (AR). With the introduction of ARKit and the gradual roll-out of ARCore (Google’s equivalent SDK) to compatible Android phones, we should see a democratization of reliable AR, letting more developers build neat apps without needing computer vision expertise or a massive budget.

We also see that users are finally getting the hang of what AR can truly offer: it can turn your face into a cat's with a Snapchat filter, let you hunt monsters in the physical world with Pokémon Go, or help you decorate your home with furniture you are planning to buy.

But what’s the deal with AR? Why are we excited about it? Well, let’s rewind a bit.

Reach out and touch it
Looking back ten years we’ve seen a massive revolution with the introduction of smartphones. With good touch screens we got the promise of direct manipulation as the main interaction style – simply reach out and touch what you want to interact with, instead of using a mouse and keyboard. This was a big deal since we humans tend to be good at pointing and touching the stuff. It feels good; it feels natural!

Over the following years, interaction patterns and design conventions evolved, and we got better and better at building apps that convey their meaning and use in a smart, easy way. Smartphones have touched so many areas of our lives that they have become an extension of ourselves.

But no matter how much we improve at designing digital metaphors for things in our life, some things are better explained in the physical world. There is a gap here: that new sofa you’re planning to buy online could have the most intricate image gallery and user-friendly information, but at the end of the day you still have to picture how it will look in your living room at home.

AR to the rescue!
AR brings with it the promise of merging the digital and the physical world. We finally have the ability to place our IKEA sofa or our big painting in our real living room before buying it, and suddenly we have bridged that gap.

We no longer have to imagine the sofa or painting where we want it – we can simply see it for ourselves. Just as touch screens eased use by skipping the cognitive step of directing a mouse, we now skip the step of imagining 3D objects in our physical world.

But it doesn’t stop there. Once that gap is bridged we can continue to leverage what AR enables. Not only can we place “real objects” in our surroundings, but we can also place a digital layer over what we see. This gives our users a superpower: the ability to see the invisible. Where are my friends at a crowded festival? What are the specifications for that car I see parked over there? What is the size of the room I’m standing in?

What about depth?
But it is not only ARKit that sparked our imagination. The iPhone X will come with a dot projector, born out of the same technology used in Microsoft's Kinect. This enables reliable depth sensing, making it possible to detect objects far more accurately, even in bad lighting.

To further close the gap to the physical world, we look forward to using it to add new interaction capabilities to your phone. Apple’s example was unlocking your phone – but why stop there? We are thinking more in terms of controlling your phone from a distance with hand gestures in the air. Again, we humans are remarkably good with our hands, so if we can find intuitive gestures for input, there is a big opportunity here. Navigating a recipe app without making the screen messy, playing a game by shooting with your finger pistols, writing in sign language. Let's just say we are excited.

Summary
TL;DR: Touch screens gave us direct manipulation. We got better at making digital interfaces, but the gap to the physical world still needs to be bridged. With AR we can take the next step in “materializing” our digital artifacts in our surroundings. Not just because it is cool, but mainly because we as humans are very adept at understanding our physical world.

We are excited about the tech coming up to speed with our expectations and are looking forward to pushing this field together with our partners.

Perhaps you have a product or service that is dying to bridge the gap into the physical world?

* We at BBH/Monterosa were part of building AR experiences years ago, with Magnum Pleasure Hunt 3 from 2011 and Vodafone BufferBusters from 2012.