When you’re fully invested in creating your next AR masterpiece, it can sometimes be difficult to see beyond your own vision. After all, when a project is carrying your name, you want it to look and behave in the manner you’ve always imagined - that’s why you tweak, iterate...and end up putting on another pot of coffee.
But, particularly as a solo or small-team creative, putting yourself firmly in the shoes of your audience is an important challenge to overcome. AR opens the door to a wide range of creative possibilities, but as the technology becomes increasingly normalized, the number of user entry points grows exponentially. It raises the question - if your carefully crafted experience works wonderfully on a top-of-the-line smartphone in your studio with super-fast broadband, can it still be enjoyed by somebody on the bus with an older-gen phone on mobile data?
That’s where optimization comes in - and it’s a crucial part of the design process here for our internal HQ creative team. Working with some of the world’s biggest brands means catering to a wide audience, with all the devices, contexts and levels of tech-familiarity that entails. That’s why I’ve turned to our Senior AR Developer, Chris, to talk about how to rise to that creative and technical challenge. At Zappar, we firmly believe that AR is far too awesome to just be experienced by the few, so let’s explore how to make content that everyone can enjoy.
Senior AR Developer, Chris Parker, explains how to optimize your AR experiences
Dave Mather: Hey Chris, what’s your background? Can you tell everyone what you do at Zappar?
Chris Parker: I come from a game design background, having graduated from UCA Farnham’s Computer Games Arts course back in 2016. I joined Zappar as a Content Developer not long after as part of the ‘Scene’ team, meaning I utilise Zapworks Studio to design AR experiences for our brand clients. After almost three years here, I recently became our Senior AR Developer, which brings some additional oversight responsibilities - though I’m still fundamentally creating AR experiences every day.
During my time here, I’ve worked on branded AR experiences for a diverse range of clients with wildly differing audiences and objectives, operating across markets worldwide. So optimizing AR experiences to suit different contexts is a huge part of what I do.
Exploring what optimization means in the AR development context
DM: When talking about optimizing an AR experience, what are we really looking at here?
CP: When it comes down to it, there are two main things that we are trying to limit as AR developers - RAM (Random Access Memory) usage and download size. It’s certainly true that smartphones are getting more powerful all the time, but they are still not as powerful as most consumer PCs. Though like PCs, we do have an array of models, operating systems and so on to consider. It’s a big positive in terms of wider access to AR, but it does present a design challenge when you’re trying to create something that’s aesthetically pleasing but also functional across different devices.
RAM refers to the amount of memory your smartphone has access to at any one time. As AR creators, we’re trying to ensure that an experience isn’t so complex that it exceeds a device’s RAM, which would end up crashing the phone.
Meanwhile, download size is really about content and ensuring that we’re not eating into users’ data usage. This is particularly true if they’re not able to access WiFi. One of the most important parts of my work is thinking about when and why a user is going to be accessing an AR experience.
So in really broad terms, you could create a highly detailed, fairly weighty AR experience that performs great in an isolated context - such as an event where there’s a definitive call to action, an obvious tracking image and stable internet connection. But would that work for a piece of connected packaging where someone could be scanning in a supermarket, or on the bus home?
The user context has a huge impact on the likelihood of scanning, so designing with that front of mind and optimizing accordingly is really important.
DM: Thinking about UX specifically, how important a part does optimization play in good vs bad UX?
CP: So much of what we do is about capturing the imagination of a user and pulling them into our AR world. If an experience takes more than a second or two to load, we risk losing them.
What we’re really talking about here is attention span. If you read any UX blogs, you’ll know that whatever medium you’re on, be it a PC, a mobile browser or an AR scanning app - you have a finite window to capture a user’s attention (and keep them engaged).
Collaborating with PEZ meant optimizing experiences for a diverse audience with a wide range of devices.
Optimization means quick load times, and quick load times ensure users get to the reason they scanned the experience in the first place - the content.
This necessitates rapid onboarding - so you’ve got to combine a swift loading time with intuitive UI and easy-to-understand copy so the user near-immediately knows how to proceed. I know that’s something we’ve covered before in Lucas’ blog on intuitive UX/UI design for AR, so there’s a lot more detail on that there.
DM: Have you ever come across a poorly optimised AR experience? What was the effect on the UX?
CP: I’ve spotted a few in my time! In most cases, poor optimisation is fairly easy to avoid, whether you’re creating in ZapWorks Studio or any of the other platforms.
The most common issue is probably experiences that visibly ‘chug’ and run poorly because they’re really resource intensive on the user’s device. That’s particularly true if someone’s designed and tested using a top of the line smartphone, while not considering that a broader audience may be using an iPhone 6, say.
Visible jagged edges on assets can also be a tell-tale sign of poor optimization. If creatives don’t do their downsizing in-scene within Studio, their images get scaled down at runtime once the experience loads, which leaves them looking unpolished.
Overall though, the most obvious sign of a poorly optimized experience is actually the AR experience not loading up because the user simply becomes impatient. As a content developer, I empathise with the desire to use the best quality assets on offer, but this can be fatal to the AR user experience. Highly detailed, high polygon count imagery leads to a massively inflated download size, which means a big barrier to entry for users (particularly those on the move) and can have a significant impact on device performance.
Sometimes you’ve just got to take a step back, consider your user and their interactions. In the vast majority of cases, users want fun and engaging content in digestible chunks - they aren’t going to be zooming in deeply to assess your imagery. Users care about good content, not poly counts!
Optimizing your AR experience with Zapworks Studio
DM: How has Studio been built to make optimizing AR experiences as simple and straightforward as possible?
CP: Studio has a feature which enables you to analyse the final download size, giving you a breakdown of all the assets used in your project in descending size order, as well as notifying you if they’re currently being used directly in the hierarchy. It’s a really useful way to spot bottlenecks or things that could potentially be optimized.
Studio's 'analyse final download' feature enables you to spot bottlenecks quickly and reduce download size
The feature is particularly useful for spotting unused assets that are still being counted in your overall download package. As you develop, you’ll often make new iterations of different assets and it’s all too easy to leave in older, unused versions without realising. The analyse download size feature isolates these for you, which is a massive help.
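Outside of Studio, the core idea behind that check is easy to sketch: compare the files you’re shipping against the assets your scenes actually reference, and flag the difference. The file names and manifest structure below are purely illustrative, not a real ZapWorks project format:

```python
# Sketch of unused-asset detection, similar in spirit to Studio's
# 'analyse final download' feature. The file names and the idea of a
# flat "referenced assets" list are hypothetical stand-ins.

def find_unused_assets(shipped_files, referenced_assets):
    """Return shipped files that nothing in the project references."""
    return sorted(set(shipped_files) - set(referenced_assets))

shipped = ["logo_v1.png", "logo_v2.png", "theme.ogg", "tree.jpg"]
referenced = ["logo_v2.png", "theme.ogg", "tree.jpg"]

# The older iteration left in by accident is isolated for removal
print(find_unused_assets(shipped, referenced))  # ['logo_v1.png']
```

A simple set difference like this is exactly the kind of bookkeeping that’s easy to forget mid-project, which is why having the tool do it for you is such a help.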
You can also re-encode images directly within Studio through a quick right click in the image library. That means you can re-optimize imagery on the fly, rather than having to load up a photo editing tool and import the image back in.
Another really useful bit of functionality is that Studio recognises duplicate files, so end users only download each file once per experience. This is due to get even better with Studio’s global content cache functionality - it’s still in beta at the moment, but that de-duplicates across all content so users accessing different experiences will see smaller download times across the board.
By entering 'image properties', Studio users can re-optimize imagery in-scene to save time
How to optimize images, audio and textures in your AR experience
DM: Let’s touch on some quick wins for ZapWorks users. What are some of the first places they should look to optimise their AR experiences?
CP: A great place to start is your 2D assets, which are more often than not your images. It’s tempting to use the PNG format in this context because of it being lossless, but JPG often makes a lot more sense for AR experiences.
Generally, exporting images at a lower resolution is not a hurdle for good quality AR, because the difference in quality is barely noticeable, if at all, when you’re dealing with smaller assets. PNGs hold onto data that maintains the image quality, whereas JPG lets it go. This logic extends to utilising a standard material to mask textures (such as a JPG and a JPG mask) as opposed to a PNG with transparency. That ends up with a far lower file size.
Conversely, saving your single colours as a PNG is actually a lot more optimized because they are only communicating a simple set of information. A JPG will try and optimize what it can see, but there’s no need for that in this context.
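The intuition here can be shown with PNG’s underlying lossless compression (DEFLATE, exposed in Python as zlib). A flat, single-colour block of data compresses down to almost nothing, while noisy, photographic-style data barely shrinks - the raw byte buffers below are simple stand-ins for real pixel data:

```python
import os
import zlib

# Stand-ins for raw pixel data: a flat single-colour block vs noisy,
# photo-like bytes. PNG's lossless DEFLATE compression (via zlib here)
# squeezes the flat block down to a tiny fraction of its raw size.
flat = bytes([200, 64, 32]) * 100_000   # one RGB colour repeated (300 KB raw)
noisy = os.urandom(300_000)             # random "photographic" bytes (300 KB raw)

print(len(zlib.compress(flat)))    # tiny - a small fraction of 300,000
print(len(zlib.compress(noisy)))   # barely below (or even above) 300,000
```

That’s why flat UI elements suit PNG so well, while a JPG encoder spends effort modelling detail that simply isn’t there.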
DM: And how about audio?
CP: Depending on the content, I’d say the best audio quick wins are:
Converting the audio to mono - this may not feel like a natural move, but unless your AR experience requires the use of headphones, you’re often not going to make the most of multi-channel audio.
Choosing OGG over WAV where OGG is smaller - OGG is lossy but with phone speakers, most of these optimisations won’t affect the quality of the sound in a noticeable way.
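As a rough sketch of the mono step using only Python’s standard library (in practice you’d more likely use an audio tool like Audacity or ffmpeg), downmixing just averages each left/right pair of samples, halving the payload:

```python
import array

def stereo_to_mono(samples):
    """Downmix interleaved 16-bit stereo samples [L, R, L, R, ...]
    to mono by averaging each left/right pair."""
    mono = array.array("h")
    for i in range(0, len(samples), 2):
        mono.append((samples[i] + samples[i + 1]) // 2)
    return mono

stereo = array.array("h", [100, 300, -200, -400])  # two stereo frames
mono = stereo_to_mono(stereo)
print(list(mono))  # [200, -300] - half the samples, half the size
```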
There are other things you can do to optimize further, but they can be more complicated to implement, which sometimes outweighs the benefits you receive. For example, streaming MP3s instead of shipping the OGG/WAV files with the experience can cut down your package size, but at the expense of a significant amount of work. So it’s about weighing up the costs and benefits.
And of course, as long as it suits your experience, you don’t necessarily have to even use a full song. If you’re able to analyse your chosen music and seek out a loopable segment, you can get it seamlessly repeating with a bit of basic scripting.
Audio looping with a bit of basic scripting is a great alternative to using a weighty full audio track for background music
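The mechanics of that looping are straightforward: pick loop points in the track, slice that segment out, and tile it to cover however long the experience runs. The sketch below uses a toy sample buffer and made-up loop points to show the idea; in a real project you’d choose the loop points by ear, ideally at zero crossings:

```python
def build_looped_track(samples, rate, loop_start_s, loop_end_s, target_s):
    """Tile a loopable segment of an audio buffer to fill target_s seconds.
    `samples` is a flat list of mono samples at `rate` Hz. The loop points
    here are illustrative - in practice you'd pick them by listening."""
    segment = samples[int(loop_start_s * rate):int(loop_end_s * rate)]
    out = []
    while len(out) < int(target_s * rate):
        out.extend(segment)
    return out[:int(target_s * rate)]

rate = 10                  # toy sample rate to keep the example readable
track = list(range(40))    # a 4-second "song"
looped = build_looped_track(track, rate, 1.0, 2.0, 3.0)
print(len(looped))  # 30 - one second of audio stretched to cover three
```

Shipping only the one-second segment instead of the full track is where the download-size saving comes from.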
Likewise, experiment with exporting your audio files at lower bit-rates. As people are going to be using an experience via their mobile phone in relatively small doses, users are unlikely to be making the absolute most of a higher bit rate audio file. Keep previewing your project with different bit rates until you find a balance where you’re happy with the compromise between audio quality and file size.
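The size impact of a lower bit rate is easy to estimate before you even export, since compressed audio size is roughly bit rate times duration (ignoring container overhead) - the bit rates below are just common examples:

```python
def estimated_audio_kb(bitrate_kbps, duration_s):
    """Approximate compressed audio size: bit rate (kilobits/s) times
    duration, divided by 8 to get kilobytes. Ignores container overhead."""
    return bitrate_kbps * duration_s / 8

# A 30-second background loop at a few common compressed-audio bit rates
for kbps in (192, 128, 96, 64):
    print(f"{kbps} kbps -> ~{estimated_audio_kb(kbps, 30):.0f} KB")
```

Against a 5MB overall budget, dropping a 30-second loop from 192 to 96 kbps frees up several hundred kilobytes on its own.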
DM: How about textures? Do they affect the overall package size at all?
CP: Textures certainly can affect the package size, so it’s about applying these elements with efficiency in mind. From a user perspective, think about how far away they’ll be from the object you’re creating for the AR experience. While I empathise with designers’ desire to make everything look as amazing as possible, don’t waste image detail on objects users will always view from a distance. Prioritise the objects users are going to be interacting with up close and personal - don’t waste valuable package size on intricate textures that users will never get to enjoy.
DM: More often than not, Zapworks users will be building AR experiences without knowing the user's end device. Is there a process you use for testing AR experiences?
CP: It’s best practice to try the experience on as many devices as possible, but when that isn’t an option, there are a few things you should make sure you prioritise.
Firstly, if you only have limited access to multiple smartphones, as a minimum requirement make sure you test on at least one iOS and one Android device. There are nuances between different models and OS versions, which can affect how your experience is displayed, but testing an example on each is at least a good start.
Secondly, if you’re a ZapWorks user, make sure you utilise the Device Simulation settings in Studio. Set it to the thinnest phone setting possible and check the alignment of all of your on screen content (particularly your UI), and then change it to the widest phone setting and do the same.
Don’t forget, if you want to build on this testing but don’t have access to an internal or formal QA team, try reaching out to your own personal networks. Colleagues, friends and family can all be super useful when testing optimization, and I’d recommend asking (nicely!) for people with a diverse range of devices and technical experience levels to briefly play around with prototypes of your AR experience. How does it perform on their device? Are they getting into the experience intuitively?
DM: Is there a preferred final download size you’d recommend?
CP: Basically as small as you can, but we tend to try and get all our experiences down to below 5MB whenever possible.
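A quick way to keep yourself honest about that budget is a small script that sums your asset sizes against the target before you publish - the asset names, sizes and 5MB limit below are purely illustrative:

```python
def check_download_budget(asset_sizes, budget_bytes=5 * 1024 * 1024):
    """Sum asset sizes (a name -> bytes mapping) and report whether the
    total fits the download budget, listing the heaviest files first."""
    total = sum(asset_sizes.values())
    heaviest = sorted(asset_sizes.items(), key=lambda kv: kv[1], reverse=True)
    return total, total <= budget_bytes, heaviest

# Hypothetical project assets, sizes in bytes
assets = {"intro.ogg": 900_000, "tree.jpg": 350_000, "ui.png": 80_000}
total, fits, heaviest = check_download_budget(assets)
print(total, fits, heaviest[0])  # 1330000 True ('intro.ogg', 900000)
```

Sorting by size points you straight at the assets where optimization effort pays off most.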
There’s always a balance to be struck between creating high fidelity experiences and user accessibility. In terms of improving user experience, it’s important to remember that while AR experiences are getting more and more popular, it’s still a relatively new form of technology for lots of people. High download size means a longer waiting time, which heightens the risk of a user getting distracted or frustrated. After all, if you’ve created an incredible, hi-res experience, does it really matter if the user is too impatient or confused to be able to engage with it?
The user context needs to be considered massively here, too. For example, you may be able to get away with a higher download package size if the experience is designed specifically for an event that’s curated, with access to good quality wi-fi. But in many cases, we’re talking about experiences that we want people to be able to access easily, wherever they are. So you’ve got to ask yourself, will my creation download quickly for a user while they’re on the train with limited data? Or while trying to show it off to a friend at work without Wi-Fi?
Zappar's QA team get to grips with a new AR experience, making sure experiences run smoothly on a wide range of devices
AR optimization in action
DM: What are some of your favorite experiences you’ve worked on and why?
CP: One I particularly enjoyed working on was our Yorkshire Tree experience for the brand Yorkshire Tea (yes, even as a Lancashire lad…), because it opened up a lot of creative possibilities for working with the target image. As this was paired with a print advertisement in magazines, we were able to optimize the user experience by making full use of this large plane, enabling users to drag trees around and plant seeds. It was visually really engaging and fit the user context well.
The 'Yorkshire Tree' experience for Yorkshire Tea was optimized to make full use of the large tracking image.
I did a lot of work on the BBC x The Open University ‘Secrets of the Human Body’ AR experience and that was really exciting. Partly because we got to work with such a recognisable, iconic institution as the BBC, creating content that matched their educational remit. But also because it was a really good example of what we like to call a ‘snackable’ experience. Optimized to be detailed but quick to load, intuitive to interact with and easily digestible - I feel we put together something well balanced that was ideal for an informative experience.
‘Secrets of the Human Body’ was an AR experience that achieved a high level of detail with a relatively small package size
Finally, I think I’ll go with our Glenlivet collaboration. This was a smaller experience, but I liked how it was really sensitively designed as a piece of connected packaging that elevated the product. As a whisky, it’s something to be savoured and the iconic bottle is going to be in someone’s home almost like a piece of furniture - and I feel like the experience really played into that by enhancing the box art.
‘The Glenlivet’ was one of Chris’ favourite connected packaging experiences, optimized to enhance the box art and give real value to customers.
The top 5 ways to optimize your AR experience
DM: So to re-cap, what would be your top 5 ways for creatives to optimize their AR experiences?
CP: These would be the key things I'd recommend...
Final download size - Keep it as small as you can and aim for 5MB max - a slow load can really put off a user.
Remove deprecated assets - You don’t need them so don’t make users download them. Use Studio’s features to remove them before publishing.
Test on a wide range of devices - Make your experience as accessible as possible by testing on older but ubiquitous smartphone models.
Dig deep into docs - There’s a wide range of helpful documentation for Studio users that deals with optimization - so get digging!
Context, CTA, Content - Remember the Zappar Three C’s! Users care about good content, not poly counts, so optimize your experience by putting yourself in your users’ shoes.
Want to learn more about optimizing your AR experiences?
Become an optimisation pro by exploring our documentation, featuring in-depth walkthroughs and video tutorials on optimizing images, video, audio and 3D models for your AR experiences. If you’re curious, why not reach out for tips and advice from our friendly community of AR enthusiasts on our Forum? It’s a great place to share ideas and receive feedback for your projects, with our active Technical Support team on hand whenever you need them. We’re always excited to hear about prospective AR projects, so if we can support you on your AR journey, then please get in touch.