If you've shopped on Zara's app recently, you might have noticed something new: a virtual try-on feature that lets you see selected items on a digital model. H&M has been experimenting with similar technology. ASOS introduced their "See My Fit" tool. Google has rolled out AI try-on across its Shopping results. Even Amazon has entered the game.
The message is clear: virtual try-on isn't a gimmick anymore — it's becoming standard. The world's biggest fashion retailers are investing millions in technology that lets you see clothes before you buy them. And for good reason: it's the single most effective way to reduce returns, increase customer confidence, and solve the fundamental problem of online fashion shopping.
But here's what nobody tells you: most of these brand-specific tools have significant limitations. Understanding what they offer — and what they don't — is the key to actually making better purchase decisions.
What the Big Brands Are Doing
Zara
Zara has integrated virtual try-on into their app, allowing customers to see selected garments on AI-generated models with different body types. It's a step forward from static product photos, but the experience is limited to Zara's own catalogue and a set of pre-built model types rather than your actual body. You're seeing how the garment looks on a body like yours, not on yours.
H&M
H&M has experimented with AI-generated model imagery and body-inclusive product photography, showing garments on a wider range of body types. Their approach focuses on representation — seeing models that look more like you — rather than personalised virtual try-on. It's an improvement over the traditional single-model approach, but you're still viewing someone else's body, not your own.
ASOS
ASOS's "See My Fit" technology shows items on models of different sizes, helping customers visualise how a garment might look on their body type. The feature covers a range of sizes and provides a more realistic preview than standard photography. Like H&M, it's about representation rather than personalisation — you choose a model that's closest to your build, but it's not you.
Google Shopping
Google introduced an AI virtual try-on feature that shows garments on a range of real models across different body types, skin tones, and sizes. When you search for clothing on Google, you can see how items drape and fit across diverse body types. It's perhaps the most advanced brand-agnostic approach from a major tech company, but again — you're viewing real models, not yourself.
Amazon
Amazon has tested virtual try-on for shoes and accessories, using AR through the phone camera. For clothing, they've focused on AI-generated "outfit inspiration" and body-type filtering rather than full garment try-on. The approach is practical but limited in scope.
The Pattern: Better, But Not There Yet
All of these initiatives share a common thread: they're improving the product photo experience by showing clothes on more diverse body types. This is genuinely valuable. Seeing a dress on someone who's a size 14 rather than exclusively on a size 6 model is a meaningful improvement for millions of shoppers.
But they all share the same fundamental limitation: you're still looking at someone else's body.
A model who's your size and height will still have different proportions — different shoulder width, different torso length, different hip shape. The way a blazer sits across their shoulders won't be identical to how it sits across yours. The point where a midi skirt hits their calf won't be the same point it hits yours. Close, perhaps. But not the same.
And they're all locked to a single retailer's catalogue. Zara's try-on only works for Zara. H&M's only works for H&M. If you're comparing a jacket from three different brands — which is exactly the kind of informed shopping that reduces returns — you need three different tools, each with different capabilities and limitations.
The future of virtual try-on isn't brand-specific. It's universal. You should be able to try on any garment, from any brand, on your actual body — not a model who happens to be your size.
What Universal Try-On Looks Like
This is the approach that platforms like Adorna have taken from the start. Rather than building try-on for one brand's catalogue, Adorna works across every brand because it starts from the other direction: your body.
You upload a photo of yourself once. That becomes your digital twin — an AI-accurate representation of your specific proportions, posture, and body shape. Then you bring any garment to it:
- Paste a product link from any online store — Zara, H&M, ASOS, COS, Uniqlo, a local boutique, anywhere
- Upload a photo of something you spotted in a magazine, on Instagram, or on a friend
- Snap a photo in-store — see the garment on your body without using the fitting room
- Use the Chrome extension while browsing any of 28+ supported retailers for instant try-on
The difference is fundamental. Brand-specific tools ask "how does our product look on a body like yours?" Universal try-on asks "how does any product look on you?"
Why This Matters for How You Shop
Cross-Brand Comparison
Smart shopping means comparing options. You've found a navy blazer you like at Zara, a similar one at COS, and another at Arket. With brand-specific try-on, you'd need three separate tools (if they even exist) to compare how each one looks. With universal try-on, you see all three on your body, side by side, in minutes.
This is how you find the one that actually flatters your frame — not the one that looked best on a model you'll never meet.
Mixing Brands in One Outfit
Nobody dresses head-to-toe in one brand. Real outfits are assembled from pieces across your wardrobe — Zara trousers, a COS knit, shoes from an independent brand, a vintage jacket. But brand-specific try-on can't show you that assembled outfit. Universal try-on can.
When your wardrobe is catalogued digitally, you can combine a new Zara blazer with the trousers you already own from Mango and the shoes from ASOS — and see the complete outfit on your body before buying the blazer. That outfit-level context is what prevents the "it doesn't go with anything I own" return.
In-Store and Online, Seamlessly
Brand tools are typically locked to the brand's own app or website. If you're in a physical Zara store, you can use Zara's tool. But in a multi-brand department store, or on a high street lined with different retailers, brand-specific tools can't help you.
A universal tool works everywhere — snap a photo of any item, any brand, any store, and see it on yourself. The context you're shopping in doesn't matter. Your body stays the same.
What to Look for in a Try-On Tool
Not all virtual try-on is created equal. As more brands and platforms launch their own versions, here's how to evaluate what actually helps versus what's marketing theatre:
- Does it show YOUR body or a model? The gold standard is your actual proportions. Pre-set body types are better than nothing, but they're an approximation.
- Does it work across brands? Brand-locked tools are useful for that brand but don't help you shop broadly.
- Does it handle different garment types? Tops, trousers, dresses, outerwear, and layered outfits all require different AI capabilities. Some tools only handle simple items.
- Can you combine with items you own? Try-on in isolation is helpful. Try-on in the context of your existing wardrobe is transformative.
- Does it work in-store too? If it's online-only, you miss half the shopping experience.
- Is it fast enough to be useful? If results take 30 seconds, you won't use it. Under 10 seconds keeps you in the flow of shopping.
The Bigger Picture: Why Retail Is Moving This Direction
Virtual try-on isn't just a consumer convenience feature. It's an economic imperative for the fashion industry.
Returns cost retailers between £5 and £15 per item to process. With return rates above 30% for online fashion, that's a devastating drain on margins. Every percentage point reduction in returns translates directly to profit. Virtual try-on is the most effective return-reduction technology available because it attacks the root cause: the gap between what you see on screen and what you get in person.
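To put those figures in perspective, here's a rough back-of-envelope sketch. The retailer size and the £10 average processing cost are illustrative assumptions chosen from within the article's £5–£15 range, not real data:

```python
# Illustrative only: figures are assumptions drawn from the ranges quoted
# above (£5-£15 processing cost per return, ~30% online return rate).

def annual_return_savings(items_sold: int, reduction_pts: float,
                          cost_per_return: float) -> float:
    """Estimate yearly savings from lowering the return rate by
    `reduction_pts` percentage points."""
    returns_avoided = items_sold * reduction_pts / 100
    return returns_avoided * cost_per_return

# Hypothetical mid-size retailer: 1,000,000 items shipped per year,
# £10 average processing cost, return rate cut by one percentage point.
print(f"£{annual_return_savings(1_000_000, 1, 10.0):,.0f}")  # £100,000
```

Note that the saving per percentage point doesn't depend on the starting return rate, which is why even modest improvements compound into meaningful margin gains at scale.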
Zara, H&M, and the other early movers understand this. Their try-on features — even in their current limited forms — are already reducing return rates for products where they're enabled. As the technology matures and becomes more personalised, it will become as standard as product photography itself.
The question for shoppers isn't whether to use virtual try-on. It's whether to wait for each individual brand to build their own limited version, or to use a universal platform that works everywhere, right now.
Where Fashion Retail Is Heading
Within the next few years, the trajectory is clear:
- Every major retailer will have some form of virtual try-on or enhanced body-matching technology
- AI personalisation will move from "models that look like you" to "actually you" — the approach Adorna already takes
- Cross-brand platforms will become the default way to shop, because consumers don't shop from a single brand
- In-store and online will merge — the same try-on tool working at your desk, on the train, and in the shop
- Fit data will accumulate over time, making each purchase decision smarter than the last
The brands that are investing in try-on today are confirming that this technology is the future of fashion retail. The difference is whether you wait for them to catch up to what's already possible — or start using it now.
Try On Any Brand, Right Now
Adorna lets you try on clothes from any brand, on your actual body, whether you're shopping online or standing in a store. It works with Zara, H&M, ASOS, and every other brand — because it starts from you, not from a catalogue. Combined with a digital wardrobe and outfit builder, it's the complete shopping companion.
Available free on iOS and the web. Don't wait for every brand to build their own — try everything on yourself today.
