iOS 12 Product Update

We blogged from WWDC earlier this year [https://bit.ly/2uoFbDH], looking in some technical detail at the changes introduced while iOS 12 was still in beta. This short piece focuses more specifically on the impact of those changes on users from a product perspective, now that they are live and being installed on millions of devices.

2018 is very much a year of incremental change for iOS: a year of refinements, optimisations and enhancements aimed at improving core performance and user experience, combined, unsurprisingly, with major upgrades to the technologies introduced last year. iOS 12 is unlikely to create a lot of headlines, but it will be good for the everyday usability of the platform.

(Note that support for iOS 12 goes back as far as the iPhone 5S and iPad Mini 2.)

Performance

Before discussing the functional changes, it’s worth noting that the single most important advance in iOS 12 is performance. It’s clear Apple has put a huge effort into optimising how the OS drives the processor, ramping performance up quickly under load and back down again to preserve battery, to achieve faster and smoother performance in everyday tasks.

Load times have also been reduced drastically across the OS. Apple claims users will be able to swipe open the camera from the lock screen up to 70% faster, bring up the keyboard up to 50% faster and, under heavy load, launch apps up to twice as fast.

Animations will be smoother and apps will be faster, and in turn more useful, even on older devices. This will give those older models a new lease of life, in many cases leaving them feeling as good as new. Clearly Apple is investing in reputational gain with its customer base over short-term device churn.

Shortcuts

Perhaps the most interesting and powerful new capability in iOS 12 is Shortcuts: essentially a quick way of carrying out common tasks, allowing developers to deep link to particular functions or screens within their apps.

The manner in which shortcuts have been introduced is somewhat confusing, most likely because they stem from Apple’s 2017 acquisition of the automation tool Workflow, which has now been brought into the operating system and mashed up with some related Android catch-up features. They’re not just about Siri, but the emphasis is certainly on using custom voice commands. Shortcuts can be created in different ways, accessed in different ways, and come in simple and more advanced varieties.

Simple shortcuts do just one thing, and are suggested to you in the Siri section of Settings based on what you do with your device. These are things like setting an alarm, seeing a weather forecast, placing a takeaway order or creating a tweet. If you see a suggestion you like, just record a Siri command to activate it. To make them more visible, some apps (such as Trainline [LINK BELOW]) include “Add to Siri” buttons at the most relevant points in the user journey.
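
For developers, exposing a simple shortcut mostly comes down to donating an NSUserActivity flagged as eligible for prediction, optionally alongside an “Add to Siri” button. Below is a minimal sketch of both, assuming a hypothetical ticketing screen; the activity type, titles and phrases are placeholders, not Trainline’s actual implementation.

```swift
import UIKit
import Intents
import IntentsUI

class TicketViewController: UIViewController {

    // Donate the activity each time the user reaches this screen, so iOS 12
    // can learn to suggest it in Settings and on the lock screen
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let activity = NSUserActivity(activityType: "com.example.tickets.buy")
        activity.title = "Buy my usual ticket"
        activity.isEligibleForPrediction = true               // new in iOS 12
        activity.suggestedInvocationPhrase = "Ticket to work" // placeholder phrase
        userActivity = activity
    }

    // An "Add to Siri" button, of the kind apps like Trainline place at
    // relevant points in the user journey
    func makeAddToSiriButton(for activity: NSUserActivity) -> INUIAddVoiceShortcutButton {
        let button = INUIAddVoiceShortcutButton(style: .black)
        button.shortcut = INShortcut(userActivity: activity)
        return button
    }
}
```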

More interestingly, you can also create or customise multi-step workflows using the Shortcuts app (not pre-installed), combining individual Actions made available by developers. As with simple shortcuts, you then record a custom trigger phrase for Siri. There’s a gallery of examples to get started with. Use cases could be anything from creating and posting an animated GIF to a journey home routine (check weather, look up best driving route, send text home, turn on heating).
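
Under the hood, the individual Actions an app contributes to the Shortcuts app come from the SiriKit intents it donates. A rough sketch of a donation, assuming a hypothetical OrderCoffeeIntent class generated from an Intent Definition file in Xcode:

```swift
import Intents

// Donate a custom intent so it surfaces as an Action in the Shortcuts app
// (OrderCoffeeIntent is a hypothetical class generated from an Intent
// Definition file; it is not from any real app mentioned in this piece)
func donateCoffeeOrder() {
    let intent = OrderCoffeeIntent()
    intent.suggestedInvocationPhrase = "Coffee time"
    INInteraction(intent: intent, response: nil).donate { error in
        if let error = error {
            print("Donation failed: \(error)")
        }
    }
}
```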

It will be interesting to see how app developers respond to all this, and how much of it finds a mainstream audience. It’s certainly a great opportunity to be creative in finding new ways to reach users and create moments of convenience and delight. The challenge, as always, will be finding the scenarios where it can have a meaningful impact and then thinking carefully about improving user journeys.

So far, as you would expect, implementation is quite exploratory and inconsistent. Citymapper, Dark Sky and Trello have already created Shortcuts (Trello using the more advanced functionality available in the Shortcuts app). We can expect Apple to build on what has been launched in subsequent updates.

Augmented reality

The biggest announcements last year were Apple’s incorporation of augmented reality and machine learning technologies within the operating system: ARKit and Core ML respectively. Not surprisingly, major upgrades to both have arrived in iOS 12, and they are potentially the biggest news in terms of opportunities for creative product development.

ARKit combined stable and accurate AR technology with ease of implementation for developers. ARKit 2 keeps up the pace of development, bringing enhancements that should drive more useful and, notably, more social experiences. Apple will be hoping these new features inspire developers and lead to greater uptake of the platform:

  • Shared scenes: Multiple users can now inhabit the same augmentation environment across multiple devices, opening interesting opportunities for both entertainment and practical collaboration.
  • Persistent tracking: AR experiences can now be sustained between sessions, so objects placed in an environment will stay there if you go away and return. Both this and shared scenes rest on a new world-map capability; see the sketch after this list.
  • 3D object detection: It’s now possible to detect 3D objects and use them to trigger augmentations. This very likely builds on technology from Metaio, which was acquired by Apple in 2015.
  • USDZ Quick Look: A new file format, USDZ, has been created for sharing 3D objects between apps and presenting them to users in a simple way called AR Quick Look. It already works in core Apple apps like Safari, Messages and Mail, and Apple will be hoping for adoption by prominent third-party apps. A notable early adopter is Shopify [see link below], which supports Quick Look when you visit selected stores in Safari.
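
As referenced above, shared scenes and persistent tracking both build on ARKit 2’s new ARWorldMap object, which captures a session’s spatial understanding so it can be archived or sent to a peer. A minimal sketch, assuming an ARSCNView-based app; the persistence and networking details are illustrative only.

```swift
import ARKit

class SharedARSession {
    let sceneView = ARSCNView()

    // Capture the session's world map so it can be saved to disk or sent
    // to a nearby device (e.g. over MultipeerConnectivity) for a shared scene
    func captureWorldMap(completion: @escaping (Data?) -> Void) {
        sceneView.session.getCurrentWorldMap { worldMap, _ in
            guard let map = worldMap,
                  let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                               requiringSecureCoding: true)
            else { return completion(nil) }
            completion(data)
        }
    }

    // Later, or on a peer's device: relocalise into the saved map so
    // previously placed objects reappear where they were left
    func restore(from data: Data) {
        guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                from: data) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.initialWorldMap = map
        sceneView.session.run(configuration,
                              options: [.resetTracking, .removeExistingAnchors])
    }
}
```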

Machine learning

Core ML was introduced last year in iOS 11, making it much easier for app developers to get started with machine learning in a meaningful way. It made it simple and secure for developers to integrate machine learning models into their apps, for example to understand context in a natural language conversation or to interpret images in real time.
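
To illustrate how little code that integration requires, here is a sketch of image classification using the Vision wrapper around a bundled Core ML model. FlowerClassifier is a hypothetical model class of the kind Xcode generates from a .mlmodel file.

```swift
import CoreML
import Vision

// Classify an image with a bundled Core ML model via the Vision framework
// (FlowerClassifier is a hypothetical Xcode-generated model class)
func classify(_ image: CGImage) throws {
    let model = try VNCoreMLModel(for: FlowerClassifier().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("\(top.identifier): \(Int(top.confidence * 100))% confidence")
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```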

The advances in Core ML 2 are more incremental and, to a non-technical audience, not as immediately obvious as those in ARKit 2, but they are significant nonetheless, improving performance and flexibility for developers in a number of ways.

Core ML 2 makes models smaller, faster, and more customisable. It allows developers to integrate a broader variety of machine learning model types into apps, and to do so in a more efficient way. The practical impact of this for uptake of ML on iOS remains to be seen.
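
One concrete example of that efficiency is Core ML 2’s batch prediction API, which evaluates many inputs in a single call rather than looping prediction by prediction. A brief sketch, with the model and inputs assumed to exist:

```swift
import CoreML

// Core ML 2 batch prediction: run a whole set of inputs through a model in
// one call, letting the framework pipeline the work efficiently
func predictBatch(model: MLModel, inputs: [MLFeatureProvider]) throws -> MLBatchProvider {
    let batch = MLArrayBatchProvider(array: inputs)
    return try model.predictions(from: batch, options: MLPredictionOptions())
}
```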

Other enhancements

Of course, there are dozens of changes across the board in iOS 12, ranging from the usual supply of new emojis to further steps in Apple’s ongoing journey of putting the user in control of their data and experience. I’ll conclude by mentioning just a few points that could be of interest to organisations delivering services on the platform:

  • Notifications - Apple continues to add nuance to notifications to make them feel less intrusive and give users a stronger sense of control. Stacking notifications per app reduces visual clutter and allows them to be dismissed en masse, and ‘Instant Tuning’ lets you tweak settings directly from an individual notification on the lock screen. Developers can refine how their own notifications stack; see the sketch after this list.
  • Passwords - A powerful new AutoFill setting allows password managers installed by the user to supply credentials everywhere in Safari and in apps. This sounds a minor point, but it removes a real barrier to seamless web user journeys.
  • Maps on CarPlay - Apple’s in-car integration technology, which allows vehicle displays to act as screen and controller for iOS devices, will now support mapping services other than Apple Maps. Relaxing control of this crucial element of the driving ecosystem will no doubt tempt in many new users.
  • Screen Time - Far more powerful than just the shock headline figures many of us are sharing, this feature lets users self-manage their device addiction (and protect family members) by setting detailed app-by-app limits and restrictions.
  • Photos - Clearly a difficult experience to get right for the hugely diverse user needs it addresses, Apple continues to experiment with the Photos interface and has introduced some intriguing innovations drawing on AI capabilities. The For You tab is a sprawling feed of suggestions and categorisations trying more than ever to interpret and find meaning in what you’ve done with the camera. There’s also an enhanced photo search which delivers the sometimes strange results we expect of image algorithms but promises much for the future.
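
On the notifications point above: iOS 12 stacks an app’s notifications automatically, but developers can control the grouping more precisely with thread identifiers. A minimal sketch; the identifiers and copy are placeholders.

```swift
import UserNotifications

// Notifications that share a threadIdentifier stack together under iOS 12's
// grouped notifications (all identifiers and text here are placeholders)
func notifyDeliveryUpdate(orderID: String) {
    let content = UNMutableNotificationContent()
    content.title = "Delivery update"
    content.body = "Your parcel is out for delivery"
    content.threadIdentifier = "order-\(orderID)"  // group per order, not just per app

    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: nil)  // nil trigger = deliver now
    UNUserNotificationCenter.current().add(request)
}
```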

 

Trainline Siri Shortcuts: https://engineering.thetrainline.com/introducing-siri-shortcuts-for-trainline-6fa6e2b9a9cc

Shopify AR: https://www.shopify.co.uk/blog/shopify-ar

 

Image created from photography by Alvaro Reyes

Author: Tim Johnson
