Is this the future of trying on clothes? Google says it is
Google's virtual try-on tool, part of a broad set of Google Shopping updates rolling out over the coming weeks, takes an image of a garment and attempts to predict how it will drape, fold, cling, stretch, and form wrinkles and shadows on a set of real human models in different poses.
The virtual try-on feature is powered by a new diffusion-based AI model that Google developed in-house. Diffusion models — the best known of which are the text-to-image generators Stable Diffusion and DALL-E 2 — learn to gradually remove noise from a starting image composed entirely of noise, moving it step by step closer to the target.
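To make the "gradually remove noise" idea concrete, here is a minimal toy sketch in Python, not Google's model: it starts from pure Gaussian noise and iteratively blends toward a clean target while re-injecting a shrinking amount of noise, loosely mimicking a DDPM-style reverse process. The oracle `predicted_clean` stands in for what a trained neural network would predict; everything here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

target = rng.uniform(0.0, 1.0, size=(8, 8))  # stand-in for a clean "image"
x = rng.normal(size=(8, 8))                  # start from pure Gaussian noise

steps = 50
for t in range(steps, 0, -1):
    # In a real diffusion model, a trained network would produce this
    # prediction; here we cheat and use the true target as an oracle.
    predicted_clean = target
    # Blend the noisy sample toward the prediction, and re-inject a
    # shrinking amount of noise, as DDPM-style samplers do.
    alpha = 1.0 / t
    noise_scale = 0.1 * (t - 1) / steps
    x = (1 - alpha) * x + alpha * predicted_clean
    x = x + noise_scale * rng.normal(size=x.shape)

error = np.abs(x - target).mean()
print(f"mean abs error after denoising: {error:.4f}")
```

The point of the loop is only the shape of the computation: each step trades a little of the current noisy sample for a little of the model's prediction, so the image sharpens gradually rather than in one jump.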
Google trained the model on many pairs of images, each showing a person wearing a garment in two different poses — for example, one image of someone wearing a shirt standing sideways and another of the same person facing forward. To make the model more robust (i.e., resistant to visual defects such as wrinkles that look distorted or unnatural), the process was repeated with random pairs of garment and person images.
For about a month now, US shoppers using Google Shopping have been able to virtually try on women's tops from brands such as Anthropologie, Everlane, H&M, and LOFT. A new "Try On" badge appears in Google Search. Men's shirts will become available later this year.
"When you try on clothes in a store, you can immediately tell whether they're right for you," wrote Lilian Rincon, senior director of consumer shopping products at Google, in a blog post. She cites research showing that 42% of online shoppers believe the models in online stores don't represent them, while 59% have felt dissatisfied with an item they bought online because it looked different on them than they expected.
Virtual clothing try-on is nothing new. Amazon and Adobe have been experimenting with generative clothing modeling for some time, as has Walmart, which since last year has offered an online feature that uses customers' own photos to model clothing.
Google has dabbled in virtual try-on before, partnering with L'Oréal, Estée Lauder, MAC Cosmetics, Black Opal, and Charlotte Tilbury to let users try makeup shades on models with a range of skin tones. Generative AI is increasingly being used in the fashion industry more broadly — and has faced pushback from fashion models, who say it exacerbates long-standing inequalities in the industry.
In her blog post, Rincon emphasized that Google chose to use real models — and a diverse selection at that, spanning sizes XXS to 4XL and representing a variety of ethnicities, skin tones, body shapes, and hair types. However, she didn't address whether the new try-on feature will mean fewer photo-shoot opportunities for models in the future. Alongside the virtual try-on feature, Google is also introducing filtering options for clothing searches. Yes, you guessed it: these are also powered by AI and visual matching algorithms.