Virtual try-ons made easy with AI and WebAR

Discover how AI and augmented reality can transform online shopping. Learn about virtual try-ons, seamless user experiences, and how this innovative concept helps you visualise clothes before you buy.

Do you ever face the same problem I do when shopping for clothes online: will this fit me? How would it look on me? This proof of concept aims to solve that issue. Imagine being able to visualise clothes on your own body using augmented reality. With rapid advances in AI, more models now enable virtual try-ons directly on a person’s body.


How it works

My 3D model uses two basic textures, one mapped to the front and one to the back, allowing a full-body view. This approach introduces some seams, but it simplifies working with the AI model. The upper body texture is sent to a server, where it is processed by an AI model hosted on Replicate (https://replicate.com/cuuupid/idm-vton).

The AI uses the original texture as input and returns updated front and back textures with the new clothing applied. These textures are sent back to the application, where animations created in Mixamo are triggered.
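To make that round trip concrete, here is a minimal sketch of how a server could call the model through Replicate’s HTTP predictions API. Treat the details as assumptions rather than the project’s actual code: the input field names (`human_img`, `garm_img`, `garment_des`) are based on the idm-vton model page, and the version hash is a placeholder you would copy from that page.

```javascript
// Sketch of the server-side call to Replicate (field names are assumptions).
// MODEL_VERSION is a placeholder; the real hash comes from the model page at
// https://replicate.com/cuuupid/idm-vton.
const MODEL_VERSION = "<idm-vton-version-hash>";

// Build the JSON body for POST https://api.replicate.com/v1/predictions.
function buildTryOnRequest(humanImgUrl, garmentImgUrl, description) {
  return {
    version: MODEL_VERSION,
    input: {
      human_img: humanImgUrl,   // the body texture from the 3D model
      garm_img: garmentImgUrl,  // the captured or uploaded clothing image
      garment_des: description, // short text description of the garment
    },
  };
}

// The actual request would look roughly like this (requires an API token).
async function requestTryOn(token, humanImgUrl, garmentImgUrl, description) {
  const res = await fetch("https://api.replicate.com/v1/predictions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(
      buildTryOnRequest(humanImgUrl, garmentImgUrl, description)
    ),
  });
  return res.json(); // a prediction object; poll its status until complete
}
```

The prediction runs asynchronously on Replicate’s side, so the server polls (or receives a webhook) until the output textures are ready, then forwards them to the app.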


Example: Using an AWE shirt as input, the results are two body textures reflecting the new clothing.


Using Mattercraft’s GLTF viewer, you can explore the materials and trigger the two animations I’ve implemented.


Thinking about UX

A good user experience is essential. How would a user interact with this system most effectively? Here’s how I approached the key user flows:

Taking a picture of any clothing 

Using Mattercraft’s default picture snapshot functionality, I implemented a way for users to capture images of clothing. The snapshot is cropped to match the visible crosshair, and only that portion is sent to the server running the AI model, giving users greater control over what is processed.
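The cropping step can be sketched like this, assuming the crosshair is a centred square overlay; the function names and the size fraction are illustrative, not taken from the project:

```javascript
// Compute the source rectangle for a centred square crosshair overlay.
// `fraction` is the crosshair size relative to the shorter image side.
function crosshairCropRect(imageWidth, imageHeight, fraction = 0.6) {
  const side = Math.round(Math.min(imageWidth, imageHeight) * fraction);
  return {
    x: Math.round((imageWidth - side) / 2),
    y: Math.round((imageHeight - side) / 2),
    width: side,
    height: side,
  };
}

// In the browser, the rect is then applied with a canvas before upload.
function cropToCanvas(source, rect) {
  const canvas = document.createElement("canvas");
  canvas.width = rect.width;
  canvas.height = rect.height;
  canvas.getContext("2d").drawImage(
    source,
    rect.x, rect.y, rect.width, rect.height, // source rect (the crosshair)
    0, 0, rect.width, rect.height            // destination rect
  );
  return canvas; // canvas.toBlob(...) then gives the payload for the server
}
```

Because only the crosshair region leaves the device, the user decides exactly which garment (and nothing else in the frame) the AI gets to see.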

Uploading a picture from the device

This feature allows users to upload high-quality images, whether from a webshop, a shared image via WhatsApp, or other sources. A simple form upload integrates seamlessly into the existing flow, sending the image to the server and receiving the processed textures in return.
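A minimal sketch of that upload flow, assuming a standard multipart form POST; the `/try-on` endpoint path and field name are placeholders, not the project’s actual API:

```javascript
// Wrap the chosen image in a multipart form; "garment" is a placeholder name.
function buildUploadForm(imageBlob, filename) {
  const form = new FormData();
  form.append("garment", imageBlob, filename);
  return form;
}

// Send the image and receive the processed textures in return.
// "/try-on" is an assumed endpoint for illustration.
async function uploadGarment(imageBlob, filename) {
  const res = await fetch("/try-on", {
    method: "POST",
    body: buildUploadForm(imageBlob, filename), // browser sets multipart headers
  });
  return res.json(); // expected to contain the new front/back textures
}
```

Since both the snapshot flow and the upload flow end in the same server call, the rest of the pipeline stays identical regardless of where the image came from.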


Using default clothing

For webshops utilising this application, pre-generated clothing templates can save processing time. These default items come with interactive AR hotspots that display the price. Clicking on a hotspot opens a modal window showing the original piece with a buy button for seamless shopping integration.
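As a generic illustration of the hotspot-to-modal behaviour, here is a sketch of the state a click could produce; the item fields and helper names are mine, not from the project:

```javascript
// Format a price stored in minor units (e.g. cents) for the hotspot label.
function formatPrice(cents, currency = "EUR") {
  return new Intl.NumberFormat("en-GB", { style: "currency", currency })
    .format(cents / 100);
}

// Clicking a hotspot produces the state the modal renders from.
// The `item` shape here is an assumption for illustration.
function onHotspotClick(item) {
  return {
    open: true,
    title: item.name,
    priceLabel: formatPrice(item.priceCents, item.currency),
    buyUrl: item.buyUrl, // links back to the webshop product page
  };
}
```

Keeping the modal as plain derived state like this makes the buy button a simple link out to the shop, with no extra round trip to the server.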


Camera controls

I’ve added functionality to snap pictures within the app, enabling users to share AR views of the clothing with friends. This was achieved quickly by utilising default actions available in Mattercraft.


Different user flows

The application supports multiple user flows, offering versatility. I particularly enjoyed working with animations (layers and clips) in Mattercraft. These can be easily triggered and assigned behaviours, such as turning them on or off when a button is pressed. This functionality streamlined development and made the application’s behaviour clear and consistent.
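Mattercraft exposes its own animation layer and clip API, so purely as a generic illustration of the on/off behaviour described above, here is a small clip-toggle helper; the names are mine, not Mattercraft’s:

```javascript
// Generic sketch of toggling named animation clips from button presses.
// This mirrors the on/off behaviour only; it is not Mattercraft's API.
function createClipToggles(clipNames) {
  const state = Object.fromEntries(clipNames.map((name) => [name, false]));
  return {
    // Called from a button handler: flip one clip, report its new state.
    toggle(name) {
      if (!(name in state)) throw new Error(`unknown clip: ${name}`);
      state[name] = !state[name];
      return state[name]; // true => play the clip, false => stop it
    },
    isPlaying: (name) => !!state[name],
  };
}
```

Mapping each button to a single toggle like this is what keeps the application’s behaviour predictable: every press is a visible state change on exactly one clip.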


Try the experience 

If you'd like to give the project a go, you can tap on this link on mobile or scan the QR code below if you're on desktop. 

Virtual_try-on-QR_Code

Conclusion

This project demonstrates how AI and AR can create a dynamic and intuitive online shopping experience, transforming how users try on clothes virtually. I’m excited to see how this concept evolves!

If you’re interested in trying it out, please send me a PM. I’d love to hear your feedback and see how you find the experience!