I’m Alayna Hughes, an artist working with immersive and interactive visuals and sound. In this post, I’ll share the process behind my latest project—an interactive musical tattoo experience that I created and debuted at the 2025 MIT Reality Hack using Designer, Zappar’s no-code WebAR tool.
For years, I’ve been fascinated by wearable technology and how we can integrate music, interaction, and the human body. With a background in music and technology, I’ve worked extensively with interactive systems, including augmented reality (AR). So, when I started thinking about new ways to merge these interests, creating interactive tattoos felt like the perfect next step.
I wanted to design an experience where people could wear music—enabling their tattoos to trigger sounds and animations, making them part of the artwork itself.
The inspiration for this project came from the paintings of Wassily Kandinsky. His use of abstract shapes and colours made me think about how I could create visual elements that serve as triggers for both sound and animation.
I started designing the tattoos in Photoshop, carefully separating each element into its own layer so that it could be individually triggered and animated, then exported the layers into After Effects for animation. For the loops tattoo, I stuck with 2D images, since GLB does not support vertex animation and I really wanted the images to transform.
Designing “Loops” in Photoshop
For the synth tattoo, I first created the design in Photoshop and then built 3D shapes in Blender to give it depth. I also created textures in Photoshop for the materials of the 3D objects, giving the shapes a more interesting colour.
Designing 3D shapes and scenes in Blender
Before moving on to animation, I needed to create music that would pair with each tattoo. Since I wanted each tattoo to represent an instrument, I designed custom sounds in Ableton Live.
For the loops tattoo, I created simple hip-hop-inspired beats, allowing users to trigger different rhythmic elements as they interacted with the design. My goal was to make the tattoos feel alive—not just visually but sonically as well.
With the sounds in place, I moved on to animation using Adobe After Effects. I imported the different beats into the timeline and animated each tattoo’s layers to respond to the music.
For example, some elements move in sync with the bass drum, while others react to different instruments. Each visual component had a role in the overall composition, creating a dynamic connection between sight and sound.
I continued this process for each element, ensuring that every animation aligned with the music before exporting them as RGB + Alpha videos for integration into AR.
Animating shapes in After Effects
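For readers curious what an "RGB + Alpha" video means under the hood: common video codecs such as H.264 have no alpha channel, so one widespread workaround is to pack the colour channels and the alpha matte side by side in a single frame and let the player recombine them at runtime. The sketch below is a hypothetical Pillow helper illustrating that packing for one frame; it is not Zappar's own tooling, just the general idea.

```python
from PIL import Image


def pack_rgb_alpha(rgba: Image.Image) -> Image.Image:
    """Pack an RGBA frame into a side-by-side RGB + alpha-matte frame.

    Left half: the colour channels. Right half: the alpha channel
    rendered as greyscale. A player that knows this layout can sample
    the right half to reconstruct transparency, even though the video
    codec itself carries no alpha. (Illustrative helper only.)
    """
    rgb = rgba.convert("RGB")                      # drop alpha, keep colour
    matte = rgba.getchannel("A").convert("RGB")    # alpha as greyscale
    packed = Image.new("RGB", (rgba.width * 2, rgba.height))
    packed.paste(rgb, (0, 0))
    packed.paste(matte, (rgba.width, 0))
    return packed
```

Applied to every frame of an export, this doubles the frame width but preserves transparency information in a codec-agnostic way.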
For the AR component, I chose WebAR instead of a dedicated app. I wanted users to simply scan a QR code and instantly access the experience—without the hassle of downloading anything. Since the tattoos only last for about 3–4 days, requiring an app didn’t make sense.
To develop the experience, I used Designer, a platform that allows for easy AR integration. While Zappar offers multiple tools, like Mattercraft, I opted for Designer because it allowed me to separate each sound into different scenes.
In Designer, I created a scene for each tattoo layer, linking it to the animations and sound triggers. When a user taps on a specific area of the tattoo, the corresponding scene plays.
Once an animation finished, it would return to the home scene, making for a seamless experience. Because not every shape in the "Loops" tattoo is interactive, I added a bounce action to the interactive shapes on the main scene to help guide the user.
“Loops” project in Designer
One of the great things about Designer is that it allows testing on both desktop and mobile devices. I highly recommend doing both since elements can behave differently depending on the platform.
Originally, the tattoo designs were transparent, but I realized this could make tracking unreliable on skin tones darker than mine. I changed the designs to have bright backgrounds, which provide easier, more robust tracking and fit the event theme of purples, yellows, blues, and pinks.
Finished prototype of “Loops”
To print the tattoos, I used a company called Tatuatu in Barcelona, which delivered them quickly; the tattoos themselves last about 3–4 days on skin.
After finalizing the design, I printed 600 tattoos and brought them to Boston for distribution at the MIT Reality Hack. The response at the event was fantastic! The platform connected quickly, and participants enjoyed interacting with the tattoos.
During testing, I noticed that the yellow tattoo had better AR tracking than the pink one, though both were functional. I'm not quite sure why; perhaps the pink design didn't have enough contrast between its shapes.
I also experimented with different tracking surface options, since the human body is not flat. Curved-surface tracking did not work well: it is tailored to bottles or cans of a specific size, and I couldn't predict where someone would place their tattoo, although the forearm, leg, or another easily accessible spot worked best.
During testing, one issue I encountered was with alpha-video transparency: my videos played back split in two, with a green background.
After trying various formats, I discovered that the bug was caused by enabling screen recording on the scene. Once I disabled it, the videos played normally; however, this meant that to document the tattoos, I or someone else would have to screen-record directly from a phone.
This project was a great learning experience, and I'm excited to explore more designs in the future. I'd like to create and sell more AR experiences, especially custom designs for artists and festivals.
Stay tuned for future projects, and if you’re interested in learning more, feel free to reach out!
Check out this video for the making of the project.
You can find Alayna at alaynahughes.com or on LinkedIn here.