Google has expanded its AI-powered virtual try-on feature to footwear, letting users preview how shoes and sneakers look on them directly in Google Search and Shopping results. The feature works from a single full-body photo rather than requiring dedicated foot shots, positioning Google to capture more of the growing virtual shopping market as consumers increasingly seek digital try-before-you-buy experiences.
How it works: The virtual shoe try-on uses the same technology as Google’s clothing try-on feature, requiring only a full-length photo upload.
• Users can select shoes from Google Shopping or search results and tap the “try it on” button to generate a virtual preview.
• Google’s AI swaps the footwear in the uploaded photo for the selected shoe; no separate foot photos are needed.
• For optimal results, Google recommends uploading “full-body shots with good lighting and fitted clothing.”
User experience features: The tool keeps a history of tried-on items and supports social sharing.
• Google maintains a “recently tried” library that users can navigate with simple swipes.
• Users can share their virtual try-on looks with friends and family, adding a social commerce element to the shopping experience.
• Initial image generation may take several seconds to process.
Key limitations: The preview shows how shoes look, not how they fit; because the AI only generates imagery, users still need to check sizing on their own.
• The technology cannot account for fit, comfort, or actual shoe dimensions.
What’s next: Google is testing additional features and expanding availability across multiple regions.
• A dedicated try-on app called Doppl is in development, featuring 360-degree avatar spins to show outfits from all angles.
• Virtual try-ons will expand to Australia, Canada, and Japan in the coming weeks.
• The feature currently works with over one billion clothing items, according to Google.