Summary
- Apple’s Visual Intelligence feature uses the camera to identify objects and answer queries
- It can provide information on plants, animals and business locations, and offer follow-up actions
- Depending on the iPhone model you have, you may need to update iOS and customise the Action button to access the feature
- Text can be summarised, translated or read aloud, and actions such as calling a phone number can be initiated
- The feature operates like Google Lens
- Options include Ask, for putting questions about what is on screen, and Search, which returns Google search results
- Swiping up on the screen will exit Visual Intelligence
By David Nield