Google is introducing new search gestures for iPhone users, making it easier to search for anything on their screen. With this feature, users can highlight, draw around, or tap on text, images, or videos to search instantly using Google Lens. This allows iPhone users to quickly find information, shop for products, identify objects, or look up words and phrases with a simple gesture.
It works directly in the Google app and the Chrome browser, with no need to take a screenshot or open a new tab. Much like Android’s Circle to Search (though not identical to it), the feature lets iPhone users search visual content more easily.
Google appears to be pushing its multimodal search capabilities further on iPhones, closing the gap with Android. The move reflects a broader strategy to make Google Search an integral part of everyday mobile experiences, regardless of platform.
This move positions Google as a major player in search innovation on iPhones, offering an alternative to Apple’s built-in Visual Look Up and Siri. As for availability, starting this week, iPhone users around the world can use the ‘Search Screen with Google Lens’ feature through the three-dot menu in both the Google app and the Chrome browser. In the coming months, iPhone users will also see a new Lens icon in Chrome’s address bar, making the feature even easier to access.
“Whether you’re reading an article, shopping for a product or watching a video, you can use this feature to quickly perform a visual search while browsing, without having to take a screenshot or open a new tab,” the company said in its blog post.
Notably, Google first unveiled the Gemini-powered ‘Circle to Search’ feature in January 2024, initially on the Pixel 8 and Samsung Galaxy S24 series.
At the same time, Google is making Lens smarter with advanced AI models that can understand and explain more unique or unusual images. While Lens has always been able to identify common objects like plants, products, and landmarks by matching them to images from the web, it can now provide deeper insights into things that may not have clear matches.
This means that if users see something unfamiliar — say, a car with a strange hood texture — they can snap a photo with Lens and get an AI Overview explaining what it might be, along with useful links for further details. With this update, AI Overviews will appear more frequently in Lens search results, even when users don’t type a question. The feature is rolling out this week for English-language users in countries where AI Overviews are available.