Apple’s September press event focused heavily on the new iPhone 16, including a new visual search feature reminiscent of Google Lens. Apple announced Visual Intelligence, a feature that uses the iPhone camera to let users search online for whatever they point it at. If this sounds familiar, it’s because nearly identical functionality has been available on Google’s Pixel phones for some time. Google’s now-ubiquitous Circle to Search lets users isolate parts of an image (most commonly clothing items and consumer goods) and find those items online through a visual search.
Where Apple’s Visual Intelligence differs is in its native integration with the iPhone camera and its partnerships with third-party search providers. While Google Lens keeps users within Google’s own platforms, primarily its Merchant Center, Apple’s demonstration showcased providers such as OpenAI’s ChatGPT and Yelp. This lets Apple compensate for its lack of an online sales platform, and for its freshly relaunched Apple Maps, by routing users to directories that can handle these searches.
This means Google’s Merchant Center will see another influx of users performing photo-based searches, while Yelp will likely gain more restaurant traffic as iPhone users photograph signage. Business managers must optimize for these AI-driven image searches to ensure their products appear when Apple customers search visually. It’s expected that once the Apple Maps relaunch is fully underway, Apple will steer Visual Intelligence back toward its own platforms. Until then, business managers need to optimize listings across all directories while keeping their Apple listings ready for when the company pivots toward its internal offerings.