Google’s AI try-on now works with just a selfie

According to TechCrunch, Google announced on Thursday that it is updating its AI try-on feature to let users virtually try on clothes using just a selfie. Previously, the feature required a full-body picture, but now it uses Google’s Gemini 2.5 Flash Image model, called Nano Banana, to generate a full-body digital version from a selfie. Users select their usual clothing size, and the feature generates several images for them to choose from as a default try-on photo. The new capability is launching in the United States today. Google first launched the try-on feature in July across Search, Google Shopping, and Google Images, and it also has a separate app called Doppl dedicated to AI try-ons.

The Selfie Economy

Here’s the thing: lowering the barrier to entry for virtual try-ons is a huge deal. Asking someone to dig up or take a decent full-body photo is a friction point. A selfie? That’s easy. Everyone has a million of them. This move basically turns any casual browsing session into a potential fitting room. It’s a smart play to capture more engagement and, ultimately, more shopping data. And that data is pure gold for training these models to be even better. But it also raises the obvious question: how accurate can a full-body model generated from a face shot really be? The fit and drape of clothing is so dependent on individual body shape. I think this is probably great for getting a general vibe of a color or style on you, but I’d be skeptical about it nailing the precise fit for, say, a pair of tailored trousers.

Google’s Shopping Ambitions

This isn’t just a cute feature update. Look at the broader context Google is building. You’ve got the main try-on feature in its core shopping surfaces, the dedicated Doppl app that just got a shoppable discovery feed, and now this selfie simplification. They’re coming at the virtual try-on space from every angle. The new feed with AI-generated videos of real products is a direct shot across the bow of social commerce platforms like TikTok and Instagram. Google wants to be the place you discover *and* confidently purchase, cutting out the middleman. If they can make the try-on experience good enough, it directly attacks one of online fashion’s biggest pain points: returns. Fewer returns mean happier merchants, which means more inventory in Google’s shopping graph. It’s a virtuous cycle they’re desperately trying to spin up.

Winners And A Lot Of Questions

So who wins here? Big, data-hungry AI companies like Google, obviously. Merchants who get featured in these try-on feeds and see lower return rates could be big winners too. The losers? Maybe smaller, pure-play virtual fitting room startups that can’t compete with a rival baked directly into the world’s largest search engine. But let’s not gloss over the big questions. What happens to that generated body model of you? How is that data stored or used? And while Google offers the option to choose from diverse body type models, which is good, the selfie-to-body tech will need to be incredibly responsible to avoid reinforcing biases. This is moving fast. The convenience is undeniable, but the implications, as always with AI, are sprawling.

You can check out the try-on feature here, and Google has more details in its blog post.
