In early June, at the Worldwide Developers Conference (WWDC), Apple announced iOS 11, the next iteration of its tremendously popular mobile operating system that powers virtually every iPhone.

When released this fall, iOS 11 will bring some major improvements to the user interface for both iPad and iPhone, but more importantly it also brings new features that developers can use within their apps. The goal of the update is to refine the user experience and make the system appear smarter – and more proactive – than ever before.

It quickly became clear from the opening keynote – where the system update was unveiled – that Apple wants to play a bigger role in the current trends of software development: for the first time, Apple will officially support Augmented Reality applications, as well as Machine Learning functionality, within the development kit for iOS.

So let’s take a quick peek at what those changes will bring to developers and end users.

The cameras just got better
It has been a long-standing rumor that the next generation of iPhone, presumably named “iPhone 8”, will be able to capture photos with depth information. This will let developers get a so-called “depth map” from an image – basically a black-and-white version of the image where the brighter areas are closer to the camera – as seen in the image below.
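For the more technically curious, here is a rough sketch of what requesting that depth map could look like in Swift, using the depth data delivery that iOS 11 adds to the photo capture APIs. It is only a sketch; a real app needs permission handling, error checking and a proper camera preview.

```swift
import AVFoundation

// Sketch: capture a photo together with its depth map on a dual-camera device.
class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        // The dual camera (currently only the iPhone 7 Plus) is required for depth capture.
        guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) else { return }
        session.beginConfiguration()
        session.sessionPreset = .photo
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(photoOutput)
        // Ask for depth data alongside the regular photo, when the hardware supports it.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // The depth map arrives with the photo as a CVPixelBuffer (the "distance image" described above).
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let depthData = photo.depthData else { return }
        let depthMap = depthData.depthDataMap
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}
```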

This will allow some really cool image manipulations based on distance (like adding post-processing filters), improved object recognition, and interaction with the device using your body as an input device – like Tom Cruise in the movie “Minority Report” – or playing games (such as tennis) with your body, as commonly seen with Microsoft’s Xbox One accessory “Kinect”.

It was a bit surprising that Apple announced this new functionality for the already existing iPhone 7 Plus (it requires two cameras, so the iPhone 7 Plus is the only supported model for now) in iOS 11. Maybe it’s a way for Apple to let developers work and experiment with the technique until the next iPhone arrives. Worth noting is that Google is working on an equivalent technique for future versions of the Android platform, under the project name Tango.


[Image: AR demo with virtual flowers]

Augmented Reality
The first major new feature is the Augmented Reality Kit, ARKit, which allows apps to “blend” fictional 3D objects and information with the surrounding environment, using the camera of the iPhone or iPad, to provide contextual information about either the physical or the virtual objects.
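For the developers, a minimal ARKit setup is surprisingly small: a view that tracks the world around the device, detects horizontal surfaces and drops a simple virtual object onto them. The sketch below is simplified and leaves out all UI polish.

```swift
import UIKit
import ARKit
import SceneKit

// Sketch: show the camera feed and place a small virtual box on every horizontal surface ARKit finds.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the device's position in the room and look for horizontal planes (tables, floors, desks).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit has found a new surface; anchor a 10 cm virtual box on top of it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0) // sit on top of the detected plane
        node.addChildNode(boxNode)
    }
}
```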

Apple showed a demo app (as seen in the picture below) that uses this new functionality for virtual furnishing, placing lamps and other items from a preset catalogue of furniture on a physical desk. It was also announced that IKEA will use this feature for their catalogue of products, so people can try out furnishing their home before buying the products. It’s also easy to see the appeal for gaming: this technique could put a full action game right inside the room you’re sitting in.

This is not a brand-new technique, but Apple’s implementation seems to have several advantages in performance and accuracy over many competitors, which may allow for a better experience – and thus more use cases.

[Image: AR demo placing an IKEA lamp on a desk]
Machine learning
Perhaps the biggest addition to iOS 11 is the support for machine learning – a feature not always visible to the user, yet one that can be put to use in many kinds of apps in various ways. Machine learning is a methodology for “teaching” a computer to see patterns and similarities in data without the developer having to explicitly define the rules – instead, the computer is provided with a chunk of data along with an explanation of what it contains, so it can teach itself.

The most notable example of this is object recognition in images, where the phone can detect which objects appear in the frame (as seen in the picture of the banana).
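In code, this kind of object recognition means feeding an image to a trained Core ML model through the new Vision framework. A minimal sketch could look like the one below; the MobileNet class is just a placeholder for whatever .mlmodel file the app actually bundles.

```swift
import UIKit
import CoreML
import Vision

// Sketch: classify the contents of a photo with a bundled Core ML model.
// "MobileNet" is a stand-in name; Xcode generates a class like this for any .mlmodel you add.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The observations are sorted by confidence, so the first one is the best guess.
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Looks like: \(best.identifier) (confidence \(best.confidence))")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```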

[Image: machine learning recognizing a banana in a photo]
So why is this important? It’s often quite easy for the user to determine whether there is a banana in the photo or not – even without machine learning. First off, the phone may help people who cannot make this distinction themselves. Secondly, the phone can also provide relevant information about that specific object: when scanning a banana, the app could show useful information such as prices from various grocery stores, nutritional values and more. Lastly, a computer has the time to go through a large data set (for instance, many pictures) and find all the bananas without the user having to do that work.

The machine learning capabilities of iOS 11 also go beyond that. The system can detect and follow the motion of objects, find faces and detect where they are looking. There is now also built-in support for text and handwriting recognition, which allows the device to, for instance, scan club cards and documents with the camera and bring them into the digital world – without anything having to be typed in manually.
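Those detection features are exposed through the same Vision framework. A small sketch of asking it for faces and text regions in an image could look like this (coordinate conversion and error handling are left out):

```swift
import Vision

// Sketch: find faces and regions of text in a photo using Vision requests.
func analyze(cgImage: CGImage) {
    let faceRequest = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Found \(faces.count) face(s)")
    }

    let textRequest = VNDetectTextRectanglesRequest { request, _ in
        let regions = request.results as? [VNTextObservation] ?? []
        print("Found \(regions.count) text region(s)")
    }
    textRequest.reportCharacterBoxes = true // also report boxes for individual characters

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([faceRequest, textRequest])
}
```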

Another cool capability is a full-featured natural language processing engine, which allows the device to interpret text and understand sentences and their meaning. This can be used to determine the sentiment of a sentence and provide appropriate content for it. It can also be used for text prediction (guessing what the user intends to write), which not only allows for faster-typing keyboards, but can also adjust the interface of the app to reflect what the user is writing.
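Much of this is exposed through the beefed-up NSLinguisticTagger API. As a rough sketch, here is how an app could pick out names of people, places and organizations in a sentence:

```swift
import Foundation

// Sketch: tag named entities (people, places, organizations) in a piece of text.
let text = "Apple announced ARKit at WWDC in San Jose."
let tagger = NSLinguisticTagger(tagSchemes: [.nameType], options: 0)
tagger.string = text

let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation, .joinNames]
let range = NSRange(location: 0, length: text.utf16.count)

tagger.enumerateTags(in: range, unit: .word, scheme: .nameType, options: options) { tag, tokenRange, _ in
    if let tag = tag, let wordRange = Range(tokenRange, in: text) {
        print("\(text[wordRange]): \(tag.rawValue)") // e.g. "Apple: OrganizationName"
    }
}
```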

And so, so much more!
iOS 11 is, like its predecessors, a big leap forward for developers and includes loads of other smaller but interesting features. These include – but are not limited to – letting users rate an app from within the app, access to the NFC hardware (which for the first time allows iPhones to read NFC tags such as club cards and public transport cards, so maybe we can finally leave our wallets at home), a new generation of AirPlay with support for playing content in several rooms at once, and password autofill (which lets the phone remember our passwords for us, since we have too many of them to remember nowadays). And much, much more!
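As a taste of the NFC part, tag reading goes through the new Core NFC framework. The sketch below shows the basic flow; a real app also needs the NFC reading entitlement and a usage description in its Info.plist before the session will start.

```swift
import CoreNFC

// Sketch: read NDEF-formatted NFC tags (e.g. a club card) with Core NFC.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func beginScanning() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    // Called with the messages found on the tag.
    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print("Record with \(record.payload.count) bytes of payload")
            }
        }
    }

    // Called when the session times out, is cancelled or fails.
    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        print("NFC session ended: \(error.localizedDescription)")
    }
}
```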

To take advantage of these new features, apps must be updated to support them. There are also new technical requirements that may require older apps to be updated to work with the new system. One of those is the migration from 32-bit to 64-bit apps. Many apps have already made that migration, and if your app has been developed by BBH in recent years you’re likely already set!

iOS 11 should be released this fall (presumably in September, if Apple follows its previous release patterns), and it will be an exciting time for developers to bring out new features and ideas – and, in turn, even more exciting for end users to get their hands on all these cool new apps.