Unlocking Visual Intelligence on iPhone with iOS 18.2
Learn how to use Visual Intelligence on the iPhone 16 with iOS 18.2, Apple’s answer to Google Lens, which covers everything from object identification to text translation.

Although the iPhone 16 series debuted in September with iOS 18, Apple didn’t begin rolling out its AI features until iOS 18.1, and Visual Intelligence arrived only with iOS 18.2. The feature requires both iOS 18.2 and the new Camera Control button introduced on the iPhone 16 models.
Visual Intelligence works much like Google Lens: point the camera at an object and either ask ChatGPT to identify it or run a Google search based on what the camera sees. It also works with text, offering summarization, translation, and even reading aloud.
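Visual Intelligence itself has no public developer API in iOS 18.2, so there is no way to invoke it directly from code. For readers curious what the object-identification step looks like in principle, here is a minimal sketch using Apple’s separate Vision framework; it assumes you already have a CGImage captured from the camera and is an illustration of on-device classification, not Apple’s actual implementation.

```swift
import Vision
import CoreGraphics

// Illustrative only: this uses the Vision framework's built-in image classifier,
// not Visual Intelligence, to label whatever the camera captured.
func identifyObjects(in capturedImage: CGImage) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: capturedImage, options: [:])
    try handler.perform([request])

    // Keep only reasonably confident labels, mirroring the "identify it" step.
    let observations = request.results ?? []
    return observations
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```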
To activate Visual Intelligence, make sure Apple Intelligence is enabled in your iPhone 16 settings, join any necessary waitlists, and download the required data. Once set up, press and hold the Camera Control button, point the camera at your subject, and choose ‘Ask’ (ChatGPT) or ‘Search’ (Google) to explore its features.
The tool is versatile: it can identify objects, pull up business information, help place orders, and handle text-based tasks. Unfortunately, the Camera Control requirement means Visual Intelligence is currently exclusive to the iPhone 16 series, making it a unique selling point for these devices.