It’s June, and with it Apple brings back WWDC: a week where they share everything they have been working on, both consumer software and new tools and frameworks for developers to create even better apps and services across their whole ecosystem.
At the start of the week there are two main events: the opening keynote, which is mostly oriented towards customers and the press, and the “Platforms State of the Union”, which is a bit more technical. In these sessions we saw Apple once again focus on privacy as a differentiator from its competitors, highlighting how some data is processed directly on the device, while cloud-stored data is end-to-end encrypted. They also reiterated that they don’t use this data for advertising, but instead give users full control to decide who can access their personal data, and when.
Besides privacy improvements, there have been important changes and new features across all platforms. Let’s take a look at the main ones.
Dark mode was one of the main changes last year on the Mac and, as expected, Apple has now brought it to iOS. On top of a new look for the operating system and its apps, it should also improve battery life on devices with OLED screens (e.g. iPhone X, XS and XS Max): unlike backlit LCDs, where power consumption is roughly constant, on OLED screens each pixel emits its own light, so darker pixels draw less power. As with macOS, Apple gives developers mechanisms to build apps that adapt to the user’s preferred theme, such as theme-specific assets, semantic colours, adaptive materials and so on.
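As a minimal sketch of what adopting dark mode looks like in practice, the snippet below uses semantic colours (which resolve automatically for the current theme) and reacts to appearance changes for manually drawn content; the "BrandTint" colour name is a hypothetical asset-catalog entry with light and dark variants.

```swift
import UIKit

final class ProfileViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Semantic colours resolve automatically for light or dark mode.
        view.backgroundColor = .systemBackground
        // "BrandTint" is a hypothetical asset-catalog colour with
        // separate light and dark variants.
        view.tintColor = UIColor(named: "BrandTint")
    }

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // Only needed for manually drawn content (e.g. CGColor-based layers),
        // which does not adapt automatically.
        if traitCollection.hasDifferentColorAppearance(comparedTo: previousTraitCollection) {
            view.layer.borderColor = UIColor.separator.cgColor
        }
    }
}
```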
Apple is providing a new UI component whose behaviour is similar to what we have seen in apps like Mail. It displays content in the form of a card that stacks on top of the presenting screen and comes with integrated gestures, such as swipe down to dismiss. The benefit of Apple providing this type of component is a more consistent UI across apps, making them easier to use for customers and easier to implement for developers.
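A rough sketch of how an app might adopt the new card presentation: on iOS 13 the default modal style becomes the card-like sheet, and a delegate callback lets the app react to the built-in swipe-down dismissal (the view controllers here are placeholders).

```swift
import UIKit

final class ListViewController: UIViewController, UIAdaptivePresentationControllerDelegate {

    func showDetails() {
        let details = UIViewController() // placeholder content
        // .automatic resolves to the new card-style page sheet on iOS 13,
        // stacking the card on top of the presenting screen.
        details.modalPresentationStyle = .automatic
        // Set to true to opt out of swipe-to-dismiss (e.g. unsaved changes).
        details.isModalInPresentation = false
        details.presentationController?.delegate = self
        present(details, animated: true)
    }

    // Called when the user dismisses the card with the swipe-down gesture.
    func presentationControllerDidDismiss(_ presentationController: UIPresentationController) {
        // Refresh state here if needed.
    }
}
```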
Context menus and SF Symbols are just two examples of many refinements across the OS. Context menus are an evolution of the peek and pop actions introduced with 3D Touch: it’s now easier to display actions in an app while keeping a unified UI across the system and across all platforms. SF Symbols, on the other hand, improve the integration of symbols into apps: they’re packaged as images but behave like fonts, which makes them easy to scale to any font size. Apple provides over 1,500 built-in system symbols that can be found in Xcode.
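A small sketch of the SF Symbols API: a symbol is created like an image, but a symbol configuration ties it to a text style so it scales with the accompanying font.

```swift
import UIKit

// SF Symbols are created like images but scale like fonts.
let config = UIImage.SymbolConfiguration(textStyle: .headline)
let heart = UIImage(systemName: "heart.fill", withConfiguration: config)

let label = UILabel()
label.font = .preferredFont(forTextStyle: .headline)

let icon = UIImageView(image: heart)
// The symbol follows the headline text style, including Dynamic Type sizes,
// and is tinted like template images.
icon.tintColor = .systemRed
```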
All the system apps have been improved, and many, like Photos or Reminders, have been completely redesigned. Others, like Maps, gain lots of new features, such as a Street View-style mode and a way to create collections of points of interest.
iPadOS is a new operating system based on iOS that aims to extend the capabilities of the iPad, with features that make the most of its screen size and form factor.
Multiple apps can now be open in different areas of the screen at the same time, and the Slide Over stack allows content from several apps to be displayed at once. This provides more flexibility and brings the iPad closer to a laptop-style window management experience.
Apple provides a new UIWindowScene API to enable and manage this behaviour, with each scene representing a single instance of the app’s UI. Prior to iPadOS, the AppDelegate was responsible for both the app process and the UI lifecycle. Now the UI lifecycle is handled by a SceneDelegate, which is completely independent and can have multiple instances.
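A minimal SceneDelegate sketch: each scene gets its own window, and an iPad app running in two spaces would get two independent instances of this class.

```swift
import UIKit

final class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // Called once per scene; an iPad app shown in two spaces at once
    // gets two independent SceneDelegate instances.
    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UINavigationController(rootViewController: UIViewController())
        self.window = window
        window.makeKeyAndVisible()
    }
}
```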
As a consequence of having multiple scenes and UI instances, Apple is also releasing new mechanisms to persist and restore the state of the app through NSUserActivity.
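A sketch of per-scene state restoration, assuming a hypothetical activity type and user-info key: the scene delegate vends an NSUserActivity when the scene is disconnected, and receives it back through the session on the next connection.

```swift
import UIKit

final class RestorableSceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // Asked when the scene is disconnected; the returned activity is handed
    // back via the scene session on the next launch.
    func stateRestorationActivity(for scene: UIScene) -> NSUserActivity? {
        // Hypothetical activity type and state for illustration.
        let activity = NSUserActivity(activityType: "com.example.viewing-document")
        activity.userInfo = ["documentID": "draft-42"]
        return activity
    }

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        if let activity = session.stateRestorationActivity {
            // Rebuild the UI from activity.userInfo here.
            _ = activity
        }
    }
}
```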
A new framework, PencilKit, allows low-latency drawing in apps, using the same engine Apple uses in native apps like Notes. The framework gives access to the canvas and palette objects individually, so apps can adopt these tools as needed.
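A minimal sketch of adopting PencilKit: a PKCanvasView hosts the drawing surface, and the shared PKToolPicker attaches the system palette to it once the view is in a window.

```swift
import UIKit
import PencilKit

final class DrawingViewController: UIViewController {
    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(canvasView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Attach the system tool palette once the view is in a window.
        if let window = view.window,
           let toolPicker = PKToolPicker.shared(for: window) {
            toolPicker.setVisible(true, forFirstResponder: canvasView)
            toolPicker.addObserver(canvasView)
            canvasView.becomeFirstResponder()
        }
    }
}
```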
Users get easier ways to select text, integrated into text views and web views. They can also undo and redo actions through gestures; these are still handled through NSUndoManager, so apps won’t need to be updated to support them. However, for any other controls, or if these gestures clash with existing gestures in the app, Apple also provides the UITextInteraction API to customise the behaviour.
watchOS 6 is the new version of the OS for the Apple Watch and takes an important step forward by allowing the watch to be used independently of the iPhone. We’re also seeing an increased focus on driving advances around health.
One of the most important changes in watchOS is that apps no longer need to be an extension of an iPhone app. Until now, developers were forced to create an iPhone app for every Apple Watch app, and watchOS apps relied on a connection to the iPhone. With watchOS 6, it is possible to create apps for the watch alone, without any dependency on the phone; it’s even possible to target the watch for push notifications only. Additionally, there is now support for CloudKit, complication pushes and text input fields in watchOS apps.
In the same spirit, the App Store is another place where the watch is becoming more independent: there’s now a dedicated App Store that allows users to browse, search and install apps directly from the watch.
It’s now possible for some apps to keep running in the background after the user lowers their wrist, using the Extended Runtime API. It is also possible to stream audio directly to the watch through APIs like NSURLSessionStreamTask or the AVFoundation framework.
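A sketch of the extended runtime API using WKExtendedRuntimeSession; note that these sessions are limited to specific app categories (such as mindfulness or self-care) declared in the app's configuration, and the controller class here is illustrative.

```swift
import WatchKit

final class MindfulnessController: NSObject, WKExtendedRuntimeSessionDelegate {
    private var session: WKExtendedRuntimeSession?

    func beginSession() {
        let session = WKExtendedRuntimeSession()
        session.delegate = self
        session.start() // keeps the app running after wrist-down
        self.session = session
    }

    func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}

    func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {
        // Wrap up work before the system ends the session.
    }

    func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSession.InvalidationReason,
                                error: Error?) {
        session = nil
    }
}
```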
As in previous years, new watch faces and complications are available, along with new apps, like a voice memo recorder and a hearing app that warns the user when they are exposed to loud noise for an extended period of time.
At last year’s WWDC, Apple introduced a very ambitious project they had been working on internally, called Marzipan, whose goal was to make it easier to port iPad apps to the Mac. This year they are making the project public and available to all developers. This is possible thanks to porting more than 40 frameworks from iOS to macOS, integrating UIKit and mapping some touch interactions to mouse interactions.
In addition to the platform-specific improvements, this WWDC introduced other important changes that apply across all platforms.
SwiftUI is a new framework that completely changes how UI is built on Apple platforms. These are the main advantages:
Less code: with SwiftUI, the amount of code required to create a UI is far less than with previous approaches. Animations are also much easier to create.
Better code: a declarative approach is used, where the app knows upfront how to draw each of its states.
Automatic customisation: SwiftUI provides automatic support for dark mode, spacing, layout insets, localisation, right to left languages, and so on.
Consistency: it's available cross-platform, meaning the same UI code could work on an iPhone, an iPad or an Apple Watch.
SwiftUI provides a live preview, both in the simulator and on real devices, which means any change in the UI code or in the visual editor is reflected automatically without having to rebuild the app.
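A small taste of the declarative style, with a hypothetical row view: the body describes the UI for each state, and the PreviewProvider at the bottom is what drives the live preview canvas in Xcode.

```swift
import SwiftUI

// Hypothetical row view for illustration.
struct SessionRow: View {
    let title: String
    @State private var isFavourite = false

    var body: some View {
        HStack {
            Text(title)
                .font(.headline)
            Spacer()
            Button(action: { self.isFavourite.toggle() }) {
                // Redrawn automatically whenever isFavourite changes.
                Image(systemName: isFavourite ? "star.fill" : "star")
            }
        }
        .padding()
    }
}

// Drives the live preview canvas in Xcode.
struct SessionRow_Previews: PreviewProvider {
    static var previews: some View {
        SessionRow(title: "Introducing SwiftUI")
            .environment(\.colorScheme, .dark) // preview dark mode too
    }
}
```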
Augmented reality is an area Apple has been working on extensively over the last couple of years, and this year they announced features in multiple areas, from making it easier to create 3D content to enabling more complex AR capabilities.
Reality Composer: a new app from Apple that allows users to create AR assets without previous 3D experience.
RealityKit: a new 3D engine for integrating 3D content into apps.
Simultaneous camera support: AR sessions can now use the front and back cameras at the same time.
People occlusion: apps can now detect people in an AR scene in real time, allowing virtual elements to be hidden by a person passing in front of them.
Motion capture: using machine learning techniques, body motion can now be tracked using a single camera.
Face tracking: up to three faces can be tracked at once.
Apple Pay in AR Quick look: an Apple Pay transaction can now be completed without leaving an AR Quick Look experience.
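As an illustration of how people occlusion is enabled in ARKit, the sketch below builds a world-tracking configuration with person segmentation; the hardware support check matters because the feature is limited to recent devices.

```swift
import ARKit

func makeOcclusionConfiguration() -> ARWorldTrackingConfiguration? {
    // People occlusion is limited to recent hardware, so check support first.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        return nil
    }
    let configuration = ARWorldTrackingConfiguration()
    // Virtual content is hidden behind people the camera sees.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    return configuration
}

// Usage with an AR session (e.g. from an ARSCNView):
// if let configuration = makeOcclusionConfiguration() {
//     sceneView.session.run(configuration)
// }
```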
Just as with ARKit, Apple has kept evolving its machine learning frameworks, supporting a wide range of new scenarios and features.
Image saliency: this provides a heat map to highlight areas of interest in an image.
Text recognition: this provides a simple way to find and read text from an image.
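A sketch of text recognition with the Vision framework: a VNRecognizeTextRequest is run against an image, and the best candidate string is kept for each detected text region.

```swift
import Foundation
import Vision

func recognizeText(in cgImage: CGImage,
                   completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    // Vision requests are synchronous, so run them off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```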
Meaning embedding: this provides a way to find words with similar meanings in a sentence.
On-device speech recognition: speech can now be recognised entirely on the device, with support for more languages (up to 10).
Speech saliency: it's now possible to interpret pronunciation, pitch and other speech characteristics.
On-device CoreML training: this allows an existing machine learning model to continue evolving when supplied with more data.
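A heavily simplified sketch of on-device model personalisation with MLUpdateTask; it assumes the model at `modelURL` was exported as updatable and that `batch` provides the new training examples, and the output file name is hypothetical.

```swift
import CoreML
import Foundation

// Assumes `modelURL` points to a compiled, updatable Core ML model and
// `batch` is an MLBatchProvider with the new training examples.
func personalize(modelAt modelURL: URL,
                 with batch: MLBatchProvider,
                 completion: @escaping (URL?) -> Void) {
    do {
        let task = try MLUpdateTask(forModelAt: modelURL,
                                    trainingData: batch,
                                    configuration: nil,
                                    completionHandler: { context in
            // Save the retrained model where the app can reload it from.
            let updatedURL = FileManager.default.temporaryDirectory
                .appendingPathComponent("Personalized.mlmodelc") // hypothetical name
            try? context.model.write(to: updatedURL)
            completion(updatedURL)
        })
        task.resume()
    } catch {
        completion(nil)
    }
}
```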
Create ML: a dedicated app to create machine learning models without prior machine learning knowledge, through the use of templates and a simple UI.
Siri now accepts parameters, making it possible to build a more conversational experience based on user input.
Shortcuts are now integrated directly into the operating system.
Shortcuts can be used to apply HomeKit automation.
One of the biggest improvements in this area is that Apple is using new technology to provide full support for managing Apple devices entirely with your voice.
This WWDC has been one of the most feature-packed yet. It brought new features for customers (and even new hardware, like the Mac Pro); completely new tools, frameworks and ways of working, like SwiftUI; and a transition in how we work with some devices: a dedicated operating system for the iPad, a much more independent Apple Watch, and an easier path to bring apps to more platforms like the Mac. It has therefore been one of the most exciting WWDCs in years, with lots of opportunities and areas to explore.