One of Apple’s best AI features gets a big boost in iOS 26.
Visual Intelligence is one of the few AI-powered features of iOS 18 that we regularly use. Just hold down the Camera Control button on your iPhone 16 (or trigger it from Control Center on an iPhone 15 Pro), point your phone at something, and hit the button.
If it’s a sign in a foreign language, you can translate it. If there’s a phone number, call it in one tap. Address? Add it to your contacts or navigate to it. Point it at a business and you may get hours and contact info, or even a menu if it’s a restaurant. And of course, it can identify all sorts of plants and animals, landmarks, famous artwork, and more. If Apple’s built-in AI doesn’t know enough, you can tap the Ask button to hand the question off to ChatGPT.
Once you remember it’s there, you’ll find yourself using it all the time. It’s genuinely useful, but it has one serious limitation—in iOS 18, it’s limited to what your phone’s camera can see.
With iOS 26, Apple fixes that by building Visual Intelligence into the screenshot interface. Now you can use the same AI-powered features on any screenshot, from any app.