Building compelling user journeys in your AR projects is vital for ensuring your users know how to interact with your experience from start to finish. In this post I sat down with Zappar’s creative content developer Tom Youel to discuss how ZapWorks users can apply UX principles and game mechanics to their augmented reality projects.

 


 

 

By David Mather, Marketing Manager - Zappar

 


 

DM: Hey Tom! When developing for augmented reality, what are you looking to achieve from the final experience?

TY: With augmented reality, you are looking to add a new dimension to the physical item that didn’t exist before. A great example of this was the Bombay Sapphire experience, designed by Anna (Broadhurst) here at Zappar. Scanning the bottle produces a garden of flowers and wildlife blooming out of the label. It complements the bottle, which is a beautiful design to begin with, by providing this stunning “hidden” digital dimension for someone to find.

 

 

 

 

DM: How big a part does the user journey play in the creative development of AR projects?

TY: A big part of creative development in AR is producing the journey the user takes. If we have an experience that tracks a physical item, like the front of a cereal box, then we need to subtly guide the user towards aiming their device’s camera towards it to begin the experience, upon which we usually have a reveal that segues into a valuable piece of content, like a game. This journey has to be carefully managed, particularly around tracking to images, to ensure the best possible user experience (UX).

ZapWorks Studio facilitates this nicely by providing event listening and state management, allowing the developer to account for situations wherein the user’s device loses tracking, and also ensuring all aspects of the experience can be neatly encapsulated away within symbols.

Augmented and mixed reality (AR and MR) are entirely new to the majority of people, and lack many of the conventions that players are already familiar with from games. As such they require quite a lot of hand-holding. The behaviour expected of users is very specific in AR and MR. Just as you would guide a new player into a game by signposting their expected actions and offering them a steadily increasing amount of responsibility in the game world, you have to perform a similar feat when designing for AR and MR. Both fields offer opportunities for completely immersing a user's attention, so in that sense they are different platforms for expressing a similar outcome.

 

DM: Talk to me about signposting a user's journey within AR experiences. How do you map this out at the start of a new AR project?

TY: Nearly all of our experiences require a front-loaded prompt, instructing the user to look at the packaging image or zapcode the experience launches from. This is because, between scanning a zapcode for the first time and the content package loading in, the user is likely to lower their device and look away. As we can listen for “seen” and “notseen” events from the target image in ZapWorks Studio, we are also able to show this prompt throughout the entire experience, which is often a useful pattern if all of the content is oriented onto the target image. Alternatively, we can “reparent” the content onto the screen in the event the device can't see the tracking image.

If an experience has a combination of tracked and “non-tracked” content, the flow state can sometimes become unsynchronised, and bugs can occur whereby, for example, a prompt appears when it shouldn’t. ZapWorks Studio has a powerful yet simple set of design patterns built in to solve this problem, so state changes in the experience can be listened for and responded to appropriately in their context.

 

DM: You recently worked on a new mixed reality experience for Supercars and Unibet. Give me a quick synopsis of the experience.

TY: It’s an immersive experience based around the activity of assembling a muscle car engine, which was displayed in a booth at a Supercars race in Australia. The virtual engine appears in front of you, with pieces separated out, like a puzzle. Using the ZapBox controllers, users must rapidly connect each piece to the correct place on the engine, assembling it into a working, revving model. Assistants at the event instruct and guide users, recording the fastest times on a (physical) leaderboard.

 

 

DM: What additional considerations did you have to keep in mind when creating such a physically immersive experience, compared to say developing for augmented reality and phone screens?

TY: Mixed reality often presents a more physical and immersive experience than augmented reality achieves. The main difference between developing for the two is that mixed reality exists as an environment. Instead of buttons on a screen, you have an object that takes up physical space in a room. Signposting intent to the user is massively important, and mixed reality therefore requires a greater depth of understanding of UX in order to communicate how interactive mechanics work, and in order to successfully suspend disbelief. It’s fun to design for, but requires a lot more thought.

 

DM: How do you make sure users interact with your experience or product in the way you want them to?

TY: The best design doesn’t require reams and reams of text to tell you how to use it. Users by and large do not read text tutorials. There is a whole concept in design called affordance, which is about “intuitive” interaction with everyday objects - for example, you know to push a door that has a flat panel instead of a handle, since there is no alternative and you already know how a door works.

You can produce a similar effect with digital experiences like games, whereby eliminating superfluous options leaves the player with visual cues on what they are supposed to do. If the player’s only option is to head to the right of the screen, and the only way onward is to get onto a higher platform, you have implicitly taught them that running and jumping are the core loop of the game.

Touch screen design is a little more involved, as the hardware is not game-specific like a game controller is. But there are certain interactive conventions, established since smartphones were brought to the mass market about ten years ago, that users expect. It is your job as a designer to either use or disregard these - and in either case, to ensure the user understands the effect of their decisions within the context of the experience.

 

 

 

DM: A lot of the projects you work on involve a heavy amount of 3D and animation. What considerations do you have to make for 3D when thinking about the user journey?

TY: Mixed reality experiences tend to be entirely 3D, as they have a responsibility to look like they are interfacing with the world, so that affects the amount of information you can expose to the user at any one time. They easily become overwhelming - particularly as the user can only look in one direction at a time. It is hard to determine whether a head movement is intentional or not, unlike with designing for a screen.

 

DM: What’s one of your favourite AR experiences you’ve worked on and why?

TY: I was really pleased with the outcome of the Bulmers football project we made with Shazam, which is a keepy-uppy game tracked to a beermat. We got the central game mechanic to be very challenging, rewarding and addictive.

Also from a strategic point of view I love the thought of targeting people during “solo moments” at the pub whilst they wait for their friends, and filling that time with a really fun game.

 

 

 

DM: That's interesting. How should designers go about making the central game mechanic challenging, rewarding and addictive?

TY: You create a core loop in the game which is simple enough for any user to pick up, but challenging to master. This is the “verb” of the game - what the player is doing during general gameplay. This could be tapping to hit a power meter at the right point, or swiping to avoid obstacles at the right time. Once you’ve accomplished that, you produce a mechanic that showers the player in positive feedback relative to how closely they’ve mastered a certain action. You then escalate the game’s challenges as the player moves forward, so that they stay within the positive “flow” zone between anxiety and boredom.

The player should fail a certain number of times, and when they fail, it should be transparently obvious how exactly they are responsible for failing, so that they can correct the action and improve: ideally, they are back into the core loop within seconds. Making the decision to dramatically punish a player for making a single wrong move, and then not easily allowing them to improve their strategy straight away, will just encourage an extremely conservative, unadventurous approach to the game. As a game designer you need to think carefully about that.

You also don’t necessarily need a points system, or to pit players against each other, to produce an addictive game mechanic. Intrinsically engaging core loops are fun to interact with regardless.

 

DM: How does AR enable you to push the boundaries of game design and creative content development?

TY: You can do with game mechanics what the Bombay Sapphire bottle does visually: interesting packaging designs that have been around for decades can be augmented.

We recently proposed a Tom & Jerry game mechanic for the lid of a Müller Corner yoghurt. The packaging is designed to tip the fruit into the yoghurt, and we simply took that idea and made it into a game mechanic, with Jerry bouncing up and down on a 3D digital version of it, avoiding capture by Tom. Users respond best to AR when it’s put in a real-world context, instead of being just a novelty.

 

 

 

DM: Thinking about game developers specifically, what additional skills or qualities did you have to learn when first joining Zappar?

TY: There are very few differences between the two fields, so the skills are easily transferable. The only major learning curve was figuring out the platform, which is designed from the ground up for augmented reality specifically. Designing for touch devices is the other important aspect, but I had experience with that before Zappar.

 

DM: So how did you hone your skills? When did you learn your craft?

TY: I started out making games and animations in (what was then) Macromedia Flash, after school, from about the age of 14. My dad was a partner in a graphic design firm at that time and had a studio in Old Street, which I would visit every week, and it was there that I found an old copy of Flash, and got to use it for the first time. I could barely make any sense out of it at first, but was totally engrossed by it.

I taught myself how to use it the hard way - from scratch - and within a few years, after much experimentation, I had produced a portfolio of work and learned the basics of programming. Thereafter, I joined UAL’s London College of Communication and completed a bachelor’s degree in Games Design. On the course, I gained a huge amount of invaluable knowledge about games design and development, and produced both higher-quality and deeper creative work with each term.

I released online a Flash game I made between my first and second years, and it gained close to half a million plays. After I left university, it helped me land my first job, at Zappar.

 

DM: That’s awesome. What was involved in the game? How did it help you to transition into the working world and land a creative content developer role at Zappar?

TY: It’s a strategy game that took a lot of inspiration from classic board games like Risk and Monopoly. The main challenge is resource management, as you move liquor around late-1920s New York during the Prohibition era as the head of a crime family, and try to win control of each of the city’s five boroughs. Not only that, there are numerous city factions (police, politicians, dock workers, etc.) and the other four rival families, who each have their own interests to attend to.

The scope of the game made it pretty overwhelming for a number of players, which led me to subsequently explore extensively how design cues and affordance can influence how players learn and make informed decisions. The game front-loaded a lot of information, which is something I learned to make a more gradual process.

More than anything else, it taught me the dedication to see a project through to fruition - especially of that scope. That helped in the working world more than anything, as it teaches you to problem-solve constantly, as well as knowing what is possible in a given time frame.

 

DM: What advice would you give game devs wanting to make the switch or try their hand in the AR/VR/MR space?

TY: Pick up the software and dive in. You can only learn by making mistakes. Remember that you are designing for other people, so take your ego out of the equation and learn how to listen. ZapWorks Studio is pretty easy to pick up even if you don’t have a background in JavaScript (and even quicker if you have a basic understanding). Start a 30-day free trial and roll your sleeves up.

 

Ready to start creating your own AR experiences?

Begin your 30-day trial of ZapWorks today.

 


 

Related links:

Tom’s portfolio

Showcase your best work on the ZapWorks forum

Designing AR experiences that add real value

Assembling the AR dream team