UX of ARKit Projects - Part 1 (Bad UX)
With the release of iOS 11 and ARKit, and with our own upcoming ARKit titles, we decided to take a look at how various apps are onboarding their users into the AR environment. Some do it well. Others - not so much. We've been working in AR for more than 5 years, so we know many of the challenges involved in showing users/players how to interact with AR space. Our goal is to shed light on how the best app shops are bringing their users up to speed in AR, and hopefully to shortcut the process for new devs working in AR for the first (or second) time.
Sometimes your content is going to be AR-driven, where the AR scene is essential for the content to be meaningful. Other times, the AR is just an additional display layer, and your content could otherwise be displayed on a traditional screen with no loss in experience. We believe the best experiences are the former, where the AR is spatially aware and is a functional driver of additional content value. In order to really benefit from this transformational quality that AR can bring to content, good AR experiences should provide the player/user with easy-to-understand instructions on how to interact with the AR scene and the content. We believe the following types of user directions are important:
DIRECTING THE AR SCENE
Bear with us and think about movie production for just a moment - each scene has lighting, mic, and camera-angle requirements, all controlled by the director. In traditional mobile apps, the developer is the director and the “scene” is basically the device's screen; you know the exact pixel dimensions, layout, etc. of that scene. But AR developers can't necessarily control where their experiences happen, either in terms of physical location or in terms of interactions on the device’s screen:
- Your players will interact with your content in different places - an outdoor park, an indoor office, the inside of your car, the outside of your car, etc. - and all of these "scenes" have different weather/lighting/GPS issues involved.
- Because the device screen acts much more like the monitor on a camera - allowing your AR content to scale in depth and pan left/right as the player moves the device through the scene - you can’t control how much of the device screen is actually dedicated to AR content versus real-world objects.
It’s exceptionally hard, if not logistically impossible, for a developer to control for all arrangements of a scene that their players will encounter. So often, you have to encourage (and assist) your players to become their own directors, to ensure that they're getting the full benefit of the AR scene. Developers have to train their players to be more engaged in the scene - to use the space, to use the camera angles, to minimize lighting or tracking distractions, to understand spatial audio cues, etc. Give players some simple breadcrumbed instructions on this spatiality, but also provide more optional details for those interested in controlling more.
The player should be encouraged to physically explore the ways the AR content is situated in their real physical environment. But always remember that spatiality requires 10x more physical activity than traditional apps. Physical fatigue is a real UX issue that should be designed around - don’t inadvertently put too many interactions above waist height unless your specific intent is to exhaust your player’s arms. Forcing physical movement as part of an interaction or gameplay - while interesting, and admirable for getting people off their keisters - means that your universe of willing participants will shrink. And all of this physicality is still mediated through the device’s screen; whether that’s a 4” porthole on a phone or a 10” window on a tablet, it’s still a fractional window onto the real world. And the surface of this window is the interaction point for your content - not gestures out in the space you’re focused on.
AR requires permission to use many of the device's sensors. These permissions are not granted by default in iOS; each one is explicitly requested by the app and granted (or not) by the user. Permission requests should be presented at the right time during the user's session, so as to maximize opt-in for the permissions that are necessary to run AR experiences. Don’t require camera or notification permissions without giving the player some context for why they’re important. If your scene directions are set up appropriately, it should be easy to get 100% opt-in for your permission requirements.
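As a concrete illustration of that timing point, here is a minimal Swift sketch of requesting camera access only after showing the user why it's needed. The class and method names are our own, not from any app reviewed here; only `AVCaptureDevice.authorizationStatus(for:)` and `requestAccess(for:)` are the real iOS APIs.

```swift
import AVFoundation
import UIKit

/// Sketch: defer the system camera dialog until the user has context.
final class PermissionPrimer {

    /// Call right before the AR scene is about to appear - not at launch.
    func requestCameraAccessWithContext(from viewController: UIViewController,
                                        onGranted: @escaping () -> Void) {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            onGranted()
        case .notDetermined:
            // Show our own explanatory screen first, THEN trigger the system dialog.
            let alert = UIAlertController(
                title: "Camera Needed",
                message: "AR draws content on top of your camera's live view, so we need camera access to place objects in your space.",
                preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "Continue", style: .default) { _ in
                AVCaptureDevice.requestAccess(for: .video) { granted in
                    DispatchQueue.main.async { if granted { onGranted() } }
                }
            })
            viewController.present(alert, animated: true)
        default:
            // Previously denied - point the user to Settings rather than re-asking.
            break
        }
    }
}
```

The key design choice is that the `.notDetermined` branch always shows your context screen before the one-shot system prompt, so the user never sees a bare permission dialog.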
AR Content Type Drives UX Quality
There are over 300 ARKit apps currently available on the App Store. For this AR UX analysis, we downloaded and tested more than 90 of the top titles and grouped them into 14 categories, ranging from the physical (furniture) to the active (sports) to the esoteric (storytelling) (full list can be seen here). We found that, generally, the more direct the physical tie to the AR content (i.e. furniture), the better the initial setup and experience with the AR scene. As the AR content moved from the physical toward the more conceptual, the directions on how to interact with the AR space became worse and worse. This varied with the publisher’s familiarity with AR, of course, but as a general rule, the less the AR content was tied to a physical analog, the more the quality of the instructions provided to the app’s users diminished. This implies an intuitive disconnect in bringing these content types into an AR layer - if the developers, who have taken a deep dive on their AR content and understand it in more nuanced ways than anyone else, can’t relate how-to instructions properly, it is unlikely that users will intuitively understand how best to experience or utilize that AR layer.

Listen, we don’t need AR just for AR’s sake. We need experiences that are qualitatively changed by presenting content in new contextual or spatial ways. We need new ways of engaging with our devices and thinking about our spatial surroundings. Pulling content off of a 2D plane (paper, screens, etc.) and into our native 3D world means that our education, utilities, decision-making, and entertainment are in the process of a transformational shift, and even these early ARKit experiences need to embrace that shift, as opposed to just repackaging traditional content and slapping it into space. Part and parcel of that new spatial awareness is bringing your users into alignment with how to experience it.
The fact that the better introductions and how-to cases often come from developers of experiences tied to a more physical analog also provides some insight into the UX of AR. We all know how to sit on a chair. We know where a “chair” fits into our world - we know how to physically move it, we know what qualities we like about certain types, we know how chair design impacts its surroundings, we can recognize even a dim outline of one in twilight, and we understand the physics of how a chair sits on the floor or ground and whether it might topple over with us in it. So it should come as no surprise that furniture visualization tools are the easiest early experiences to understand in the ARKit environment. ARKit’s floor-plane-driven experience starts with gravity and the floor as the primary building blocks of the experience. The further an experience moves away from that grounding metaphor - from embedded gestalt principles of comprehending the physical world - the harder it will be to present an understandable user experience, both in terms of content consumption and in terms of simply instructing your users how to experience it.
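For readers new to ARKit, that floor-first model is visible right in the session setup. Below is a minimal sketch (view controller and log message are illustrative, not from any app discussed here) of enabling horizontal plane detection and hooking the delegate callback that fires when a surface is found - the natural place to give the user "surface detected" feedback.

```swift
import ARKit
import SceneKit

/// Sketch: gravity + floor as the grounding metaphor of an ARKit session.
final class FloorPlaneViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // the floor is the anchor of the experience
        sceneView.session.run(configuration)
    }

    // Called once per detected plane - a natural hook for "surface found" feedback.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Found a surface \(planeAnchor.extent.x)m x \(planeAnchor.extent.z)m - tell the user!")
    }
}
```

Apps that surface this `didAdd` moment to the user (a highlight, a prompt, a sound) are exactly the ones we found easiest to onboard into.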
Poor UX in Current ARKit Apps
We would prefer a world where no apps appear in this category. And we apologize for singling out those experiences that we don't think meet our own (unreasonably high?) expectations, but without examples, how will the community learn? So please don't look at the inclusion of these apps as us pointing the finger at these devs to shame them - rather, these are simply examples of what we wouldn't do, or wouldn't have done, if we were in their shoes. And because of these publishers’ size and prominence in their respective industries: A) criticism here isn’t going to affect their bottom line; B) these companies should have known better - and spent better; and C) despite their industry presence/brand, these experiences shouldn’t be used as examples by smaller shops, regardless of how popular they may (or may not) be with the public. Here, then, is a small collection of apps that have attempted to bring their users up to speed on how to work with AR, but for some reason (stated below), they miss the mark.
CHALK by VUFORIA
Honestly, we were very excited to see how Chalk would present AR to its users. After all, Chalk comes from AR powerhouse Vuforia, which, until ARKit came out, was THE dominant AR library in use across the market. (Mea culpa: Chalk is not an ARKit app, but it was released at the same time and should have provided a good example of how to use AR.) We played with the early Chalk prototypes at AWE 2017, and the app is getting a lot of good PR exposure. However, the user experience is... to put it mildly... shitty. The onboarding session consists solely of forms and terms and agreements and even an email verification, and after you slog your way through those unexplained steps, you can't even see or experience the app's content or functionality unless you've connected with someone else through the app. What a waste of our time! From there, you're stuck, and by this time, we've forgotten what Chalk is supposed to do for us. So technically we can't even say the AR experience is good or bad - it isn't there, which makes this more of a traditional UX failure than a clear AR UX failure, but the AR is absent, and that's the whole point. FORM -> OUT OF APP ACTION -> PERMISSION -> PERMISSION -> PERMISSION -> SPAM YOUR CONTACTS -> NO AR is just a terrible sequence.
PORSCHE AR IMAGINE
We were similarly interested to see what a moneyed luxury brand would bring to the UX of AR. Unfortunately, this seems to be one of those examples of bad timing in the onboarding sequence, and it is one of myriad examples in the ARKit category where camera permissions are the very first interaction with the app, without any explanatory context. Many of these apps just assume that users will understand that camera permissions are required for AR to work, and that because this is an AR-first app, those permissions will be granted. In our opinion, users need to be given some context about the role of the camera in the AR experience.
The AR-specific instructions at the end of this sequence are what we consider rudimentary-to-basic: some context about the spatiality of the scene, but no appreciation for scene sizing (it's going to place a life-sized vehicle in your space, but there’s no warning) and little instruction on improving the experience. Overall, PERMISSIONS -> TERMS -> BASIC INSTRUCTIONS -> UNEXPECTED SCENE SIZING is a subpar UX.
HOLO by 8i
Holo is a camera app published by volumetric video company 8i. They’re a venture-funded B2B software company, and this consumer-aimed app is an attempt to build awareness and use of holographic video. The artwork design is great, but the UI and UX are really subpar. To start, getting from install to even the beginning of AR content takes over 40 seconds of fairly useless interactions. There’s an age requirement, permission screens, and “latest” details - but very, very little in the way of AR instructions, and the little that IS there appears at the very beginning, so by the time you reach the scene, you’ve forgotten the 1-2 instructions they provided - not that they were terribly good to begin with. Context is king with AR! Panning around doesn’t reveal any actions to the user, either - no indication that any scene detection is happening at all, and no onscreen prompts for how to build your AR environment. It appears the experience is targeted at the Snapchat demographic - a userbase that normally tries to figure out functions without any onscreen prompting - because only when you mash the screen trying to make anything happen does the plane detection indicator appear. AR requires more prompting than simple, traditional mobile content, so going the Snapchat directionless route is really not yet a viable option for AR experiences, in our opinion. And then there’s the AR content… ahem.
ARKit is still in its public infancy. So, just like the early App Store, many of the apps on the market are copycats of one idea or another. The biggest instance of this copycatting is with measurement apps. In our spreadsheet tracker, we identified at least 39 measurement apps, and for the most part these apps worked very similarly, with little instruction or guidance on how best to set up the measuring scene. We found two positive outliers in this group (which we will identify in our subsequent post on good ARKit UX). But for the others, despite having such a clear tie to the physical environment (conceptually, “measuring” should go hand-in-hand with furniture visualization experiences, and so should be almost as adept at helping users understand their AR space), these measurement apps did nothing to help their users understand their environment and how best to interact with the AR measurements. Rather, this segment just seems to be the low-hanging fruit of ARKit experiences, and maybe the entry point for dev teams to expand their experimentation with AR. These apps feel like the flashlight apps of 2010.
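It's easy to see why measuring is the low-hanging fruit: the core of such an app is only a few lines. Here is our own minimal sketch (the `TapMeasurer` class is illustrative, not taken from any app we tested) of hit-testing two screen taps against detected planes and measuring the distance between the resulting world positions - note how the "no plane under the tap" branch is exactly where most of these apps fail to prompt the user.

```swift
import ARKit

/// Sketch: tap twice, get the distance in meters between the two points.
final class TapMeasurer {
    private var firstPoint: simd_float4x4?

    /// Feed each tap location; returns a distance once two points are placed.
    func handleTap(at point: CGPoint, in sceneView: ARSCNView) -> Float? {
        guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else {
            return nil  // no plane under the tap - the right moment to guide the user
        }
        guard let first = firstPoint else {
            firstPoint = result.worldTransform  // first endpoint placed
            return nil
        }
        firstPoint = nil
        // Distance between the translation components of the two transforms.
        let a = first.columns.3
        let b = result.worldTransform.columns.3
        return simd_distance(simd_float3(a.x, a.y, a.z), simd_float3(b.x, b.y, b.z))
    }
}
```

Since the mechanics are this thin, the differentiator between the 39 near-identical apps is everything around this snippet: how the scene is explained, how plane detection progress is shown, and how tap failures are communicated.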
Let us reiterate that this post is not about calling out Porsche, 8i, and Vuforia - it is about helping AR developers recognize bad user experience, whatever the source. Steering AR content through an ideal experience is hard to get right, and an important skill for any AR dev team to build, including ours, is being able to set the stage for proper content interaction. We’ve never gotten the experience 100% right, but over the past 5+ years of AR development, we’ve certainly tried a huge number of ways to introduce people to what AR is and can do. Now that ARKit is probably the world’s most successful AR tech stack, many more people are going to use and play with AR, both for its functionality and for its transformative properties for content. Help them - don’t assume they’ll know how to operate this new digital-spatial world. Focus on clearly defined interactions and experiences that are not physically demanding. Ask for permissions at the right place and time so that users understand why they’re granting you access to their camera and other capabilities. And help people get to the AR as soon - and as easily - as possible. Again - help them.