Apple’s iPhones Now Support Augmented Reality, So What Does That Mean for UX Designers?
Apple recently debuted its latest line of iPhones, the iPhone 8, 8 Plus, and X. What makes them stand out is Apple’s support of augmented reality (AR) in all three models. Apple’s throwing so much weight behind AR that not only are its new iPhones optimized for AR, but so is iOS 11. Thanks to ARKit, Apple’s AR SDK, developers can now create AR apps for Apple devices.
Apple CEO Tim Cook boasted that AR will be even bigger than virtual reality (VR). Apple is investing a lot in AR, both in development resources and in upfront credibility.
This raises the question: what does this mean for UX designers?
The big difference.
Before Apple’s embrace of AR, designers and developers had to build AR capabilities into each iOS app themselves. Not only did that take longer, but it also meant uneven UX: with no standard AR engine, the interaction and viewing experience varied wildly from one AR app to the next.
Apple making AR part of the iPhone and its operating system is, therefore, a game changer. AR work is now more accessible to designers who want to build AR apps, and it’s easier for them to give users a consistently good UX.
Google, with its competing Android devices, has also rolled out an AR framework, ARCore, which lets designers create AR apps on Android, too. However, unlike iOS, Android has to accommodate a plethora of hardware configurations, which hampers app design and development whenever the platform is updated. While iOS updates roll out to all users nearly simultaneously, Android updates reach users much more slowly (and more chaotically) because of the processor, hardware, and wireless-carrier issues involved.
The end result is that most Android users don’t run the latest version of the software. From a designer’s standpoint, that means many users never experience their fine-tuning or improvements. It also undercuts the profit motive to roll out the latest, hottest AR apps for Android.
Saves time for UX designers.
Before ARKit, designers who wanted to dabble in AR apps had to invest quite a bit of time. ARKit is a framework that runs inside the app, letting developers write AR software far more efficiently. The key is the set of shared building blocks ARKit supplies to every app that uses it: session management, motion tracking, and hooks into Apple’s rendering engines. Building on the same standards naturally means faster development and an easier rollout.
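To see how little setup this leaves to the app, here is a minimal sketch of starting an AR session. ARSCNView and ARWorldTrackingConfiguration are real ARKit APIs; the view controller itself is a hypothetical example, not code from Apple:

```swift
import ARKit
import SceneKit
import UIKit

// Hypothetical view controller showing a minimal ARKit session.
final class ARDemoViewController: UIViewController {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // ARKit takes over camera capture, motion tracking, and
        // render synchronization; the app just runs a configuration.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```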
Now, designers are freed from the preliminary groundwork of AR app creation; ARKit handles that for them. Gone are the hours spent writing proprietary AR code and the frustrating trial and error of making sure it works well on every device.
In theory, this should also make cross-platform apps easier to develop.
Built-in advantages.
ARKit addresses some of the typical problems designers ran into when building AR apps. The following features give designers an edge in AR app development (the sketch after this list shows how each maps onto ARKit’s API):
- Visual-inertial odometry. In layman’s terms: world tracking. ARKit fuses data from the iPhone’s motion sensors and camera to home in on specific points in the environment (a table, a chair, etc.) and track them as the user moves the phone. This anchors virtual objects in place in the real environment — allowing designers to accurately adjust things like angle and scale.
- Scene understanding. AR-optimized iPhone cameras can spot real-world surfaces, such as tables and floors, and classify them as horizontal planes that serve as the stage for 3D graphics.
- Light estimation. ARKit estimates the amount of light in the scene so that virtual objects can be lit and shadowed to match, making 3D models rendered over the live camera feed look far more believable.
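Here is a sketch of how those three features surface in code. The class name and print statements are illustrative; ARWorldTrackingConfiguration, planeDetection, isLightEstimationEnabled, ARPlaneAnchor, and ARFrame.lightEstimate are the actual ARKit APIs:

```swift
import ARKit
import SceneKit

// Illustrative object wiring up world tracking, plane detection,
// and light estimation in a single ARKit session.
final class FeatureDemo: NSObject, ARSCNViewDelegate, ARSessionDelegate {
    let sceneView = ARSCNView()

    func start() {
        sceneView.delegate = self          // surface-detection callbacks
        sceneView.session.delegate = self  // per-frame callbacks

        let config = ARWorldTrackingConfiguration() // world tracking (VIO)
        config.planeDetection = .horizontal         // scene understanding
        config.isLightEstimationEnabled = true      // light estimation
        sceneView.session.run(config)
    }

    // Scene understanding: ARKit found a horizontal surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Detected surface with extent \(plane.extent)")
    }

    // Light estimation: ambientIntensity of ~1000 means neutral lighting.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let estimate = frame.lightEstimate {
            print("Ambient intensity: \(estimate.ambientIntensity)")
        }
    }
}
```

A designer prototyping an AR scene can read that ambient intensity and dim or brighten virtual materials so they blend into the room’s actual lighting.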
Helps UX designers create better UX.
At the end of the day, the point of making new technology accessible is to serve the user with a better experience. With AR built into the newest iPhones, UX designers will naturally be pushed to offer a richer UX than apps without AR can deliver, simply because they can do so much more.
“AR has a huge potential to help people complete tasks with less mental effort and fewer steps, primarily by providing support for combining multiple sources of information in a way that was not really possible before,” explains Raluca Budiu, director of research at UX-research firm Nielsen Norman Group.
“For example, before AR, if I need to assemble a piece of furniture, I have to look at my phone for the instructions, then look at my parts and identify them, then look back at the instructions and start working. As I assemble the piece, I need to keep the instruction in my memory and follow it (or interrupt myself and go back to the text if I forget what I need to do, or if I am confused). My attention is always divided between the different sources of information, and each time I switch from one to the other, I have to work to recover context — for example, finding the right instruction step on the page.
“With AR, all this back-and-forth disappears,” Budiu says. “I can have my different pieces pointed out to me directly, and I can be shown what I am supposed to do right there, in the right context. I save both interaction cost (navigating to the instructions in my app) and cognitive load (I don’t need to remember instructions as I work on the object).”
Thanks to ARKit and AR-optimized iPhones, Apple has essentially mainstreamed AR. We’ll soon see whether the public is equally quick to embrace AR on their phones. If last year’s Pokémon Go craze was any indication, AR’s future on smartphones looks bright.