Tim Cook says that this is the most popular Apple Intelligence feature on iPhone

The Apple Intelligence feature that is the most popular on iPhone is one that I really like too.

After Apple reported a strong fiscal first quarter of 2026, CEO Tim Cook hosted the earnings call, during which he revealed the Apple Intelligence feature that he says is the most popular among iPhone users. I suspect this will change once the revised version of Siri is released in iOS 26.4, but for now, Cook says that Visual Intelligence is the most popular Apple Intelligence feature on the iPhone. In some ways, this AI-based feature is similar to Google Lens.

Where to find Visual Intelligence and which iPhone models have it


This feature is accessed via the Camera Control on the iPhone 16 and iPhone 17 lines (including the iPhone Air). On some recent models without Camera Control, like the iPhone 15 Pro, iPhone 15 Pro Max, and the iPhone 16e, Visual Intelligence can be assigned to the Action Button or a Control Center panel. Point your iPhone's camera at a compatible restaurant, and Visual Intelligence can show you its operating hours and even a menu. Text can be translated into a different language, summarized, or read out loud. You can also use Visual Intelligence to identify landmarks, animals, and plants.

Do you have a favorite Apple Intelligence feature on iPhone?

Cook said that adoption of Apple Intelligence has been strong since Apple launched its AI initiative in 2024. iPhone users with Apple Intelligence can summarize emails and websites, block certain content on webpages, write using certain tones, have a conversation translated in real time, create Emoji, create images in the style of Anime, Oil Paintings, and Watercolors with the Image Playground, and more.

Delays adding AI features to Siri made Apple Intelligence a fiasco for Apple


Despite what seems like quite a few AI-based features, the failure to have Siri on board as a chatbot (which won't happen until iOS 27) is one of the reasons why Apple Intelligence feels like a failure. However, in iOS 26.4, Siri gains a Large Language Model (LLM) foundation, allowing the assistant to hold more natural and complex conversations. Personal Siri also kicks in with iOS 26.4: a question such as "What time did my brother say he was arriving?" can be answered by Siri as it goes through your texts, photos, emails, and files looking for the answer.

In iOS 26.4, you'll be able to tell Siri, "Edit this photo and send it to Joe Mama" or "Add this address from my messages to Joe's contact card." If you want a powerful digital assistant answering questions in-depth, you'll have to wait for Siri in iOS 27 later this year. Or you can buy a Pixel and use Gemini as your assistant now.

