Amazon has announced the launch of Lens Live, a new feature in its Shopping app designed to help customers find products quickly using their smartphone cameras. The company says that tens of millions of U.S. customers on iOS now have access to the tool, with plans to expand availability in the coming weeks.
Lens Live allows users to scan items they see in real life or on social media and receive instant matches from Amazon’s catalog in a swipeable carousel at the bottom of their screen. Customers can tap an item within the camera view to focus on it, add products directly to their cart, or save them to wish lists without leaving the camera interface.
The update also integrates Rufus, Amazon’s AI-powered shopping assistant. According to Amazon, “To help customers using Lens Live learn more about products they’re viewing, we’ve integrated our AI shopping assistant, Rufus, into the experience. While in the camera view, customers will now see suggested questions and quick summaries of what makes a product stand out. These conversational prompts and summaries appear under the product carousel, allowing customers to perform speedy research, quickly access key product insights, and get their questions answered.”
Amazon notes that Lens Live is powered by Amazon OpenSearch and Amazon SageMaker, AWS-managed services used to deploy machine learning models at scale. Computer vision models run on-device to detect objects in real time as users pan across a scene or focus on a specific item, and a deep learning visual embedding model then matches what the camera sees against billions of products in Amazon's catalog.
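To make the embedding-and-match idea concrete, the minimal sketch below shows the general pattern of visual similarity search: an image crop is converted into a fixed-length vector and ranked against precomputed catalog vectors by cosine similarity. Everything here is an assumption for illustration (the embedding function, dimensions, catalog size, and data are placeholders), not Amazon's actual models or infrastructure; in a production system the catalog vectors would live in a vector index such as OpenSearch's k-NN search rather than an in-memory array.

```python
import numpy as np

# Hypothetical constants -- not Amazon's real values.
EMBED_DIM = 256          # assumed embedding size
CATALOG_SIZE = 10_000    # stand-in for a catalog of billions of products
CROP_SHAPE = (64, 64, 3) # assumed fixed-size RGB crop of the tapped object

rng = np.random.default_rng(42)

# Fixed random projection standing in for a trained deep embedding network.
PROJECTION = rng.standard_normal(
    (int(np.prod(CROP_SHAPE)), EMBED_DIM)
).astype(np.float32)

def embed(image_crop: np.ndarray) -> np.ndarray:
    """Map an image crop to a unit-length embedding vector (toy stand-in)."""
    flat = image_crop.astype(np.float32).ravel()
    vec = flat @ PROJECTION
    return vec / np.linalg.norm(vec)

# Precomputed catalog embeddings (randomly generated here for illustration).
catalog = rng.standard_normal((CATALOG_SIZE, EMBED_DIM)).astype(np.float32)
catalog /= np.linalg.norm(catalog, axis=1, keepdims=True)

def match_products(image_crop: np.ndarray, top_k: int = 5) -> list[int]:
    """Return indices of the most visually similar catalog items."""
    query = embed(image_crop)
    scores = catalog @ query  # cosine similarity, since vectors are unit-norm
    return np.argsort(-scores)[:top_k].tolist()

# Simulate a tapped object in the camera view and fetch candidate matches.
frame_crop = rng.integers(0, 256, size=CROP_SHAPE)
print(match_products(frame_crop))
```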
Customers who prefer traditional methods can still use existing features like taking pictures, uploading images, or scanning barcodes within Amazon Lens.
Lens Live builds on Amazon's recent efforts to make shopping more informed and interactive through generative AI, including Rufus, which is now available to all U.S. customers both in the app and on desktop.
