Rating Apple Intelligence features from “Meh” to “That might be OK”
In the history of Apple doing “Apple things” (a history spanning about 20 years now), hijacking the name of an entire tech trend probably beats them all. Of course, I am talking about AI — or, as Apple likes to call it, Apple Intelligence.
We’re going to skip the debate over how much of current AI is actually “Artificial Intelligence”. Back in 2018, some manufacturers tried to pull the “AI tech” card, but people spoke up, and the AI term was quickly dropped in favor of Machine Learning and Deep Learning, which were accepted as more accurate.
Now, it’s back — there was a huge boom in late 2022, and everyone felt the pressure to jump on the trend. There is a plethora of chatbots, image generators, even song generators.
Google rushed through projects like Bard, Duet AI, LaMDA, until it finally streamlined the branding under the Gemini umbrella, saving us a lot of confusion along the way (well… there’s still Gemini nano, Pro, Ultra, Advanced, Business, Enterprise, but who’s counting). Samsung, more calmly, launched Galaxy AI.
But in the end, the excitement is tapering off. A lot of these tools do the same things — they are extremely smart with text and summarizing, OK with images, meme-worthy with music (for now).
And then, Apple joined in on the fun
When Apple proudly dubbed its AI “Apple Intelligence” at the start of WWDC 2024, everyone was on the edge of their seats. What could a company known for putting a unique, super-polished, user-friendly spin on existing tech do?
Some months have passed, and the answer thus far is “Nothing new, really”.
For one, most of the promised Apple Intelligence features and functions are not even launching with iOS 18. We should get some of them “within a month”, many “by the end of the year”. But OK, let’s not focus on this. The thing is that most of what was promised falls into the “Uhhh, OK” category, a few of the features prompt a quick “Oh yeah, we’ve seen that”, and even fewer are in the “Actually, that might be useful” camp.
Let’s take a deep look and rate them all!
Writing tools — what’s new?
- Rephrasing texts and emails - Seen it
- Proofreading - Been there
- Text summary - Ah, OK
It seems that every new AI platform nowadays offers to rephrase your emails to match a certain tone. I’ll be honest, my initial reaction was “Who would want AI to change the way they express themselves to the world?”. But, apparently, HR people have been reporting an avalanche of AI-generated cover letters with job applications over the past year.
Well… I stand corrected. My feelings towards having AI rephrase your email haven’t changed, mind you, but apparently, there are all kinds of people out there.
Then, as Apple promises, its AI will also offer proofreading. Which… may be a slightly more advanced version of autocorrect, but we’ve seen such tools, like Grammarly, way before AI even became a trend. So, if anything, this isn’t groundbreaking new “Apple Intelligence”; it’s a catch-up jump.
And, finally, there’s summarizing. Now, I am a proponent of AI summarizing and have happily been using the Galaxy S24 Ultra’s “summarize webpage” feature when someone sends me an article that’s so long that I’d never read it. In fact, it’s probably the only Galaxy AI feature I use. So… OK, Apple, that’s cool.
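For the curious, here is roughly what the oldest, simplest flavor of summarization looks like (a toy extractive sketch; Apple, Samsung, and the chatbots all use trained language models that write new text, not this): score each sentence by how frequent its words are in the document, and keep the top scorers.

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 2) -> str:
    """Toy extractive summarizer: keep the sentences whose words are
    most frequent in the text overall, preserving original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        # Average word frequency, so long sentences aren't favored.
        words = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[w] for w in words) / max(len(words), 1)

    top = sorted(sentences, key=score, reverse=True)[:max_sentences]
    return " ".join(s for s in sentences if s in top)

# The two sentences about apples share frequent words, so they win:
summarize("Apples are great. Apples are tasty. Bananas exist.", 2)
```

It’s crude, but it shows why “summarize this webpage” is such a natural fit for on-device AI: the input and the scoring all stay local.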
Photos
- Generate Memories with natural text prompt - OK
- Search for photos via natural text prompt - OK
- Search for exact moments in clips - OK
- Clean Up (a.k.a. Magic Eraser) - Seen it
You know how iOS Photos sometimes creates “memory” collages for you, based on time, place, people in the photos? Yeah, that’s cool, and you will now be able to prompt the phone to do it for you by requesting a memory in free text form.
Thanks to AI, I guess, the phone will be able to understand that request and generate a new memory clip for you.
Truth be told, I am happy to review collages whenever they get served up to me via a notification; I’m not sure I’d ever go looking for one. My most memorable and happy trips are already organized in albums, but OK. I’ll give this a 6/10 — nobody asked for it, but it may be nice to have once in a while.
You should also be able to look for specific photos via free text. Google Photos and Apple Photos already recognize faces, and you can go looking for “dog” or even a specific breed if you wish. But, I suppose, the point here is that you can go looking for “Jimmy blowing out the candles at his 6th birthday”, which is cool. It’s always good to have super-specific search capabilities.
We will also be able to search for specific moments within videos, like searching for “Jumping in the water” in a clip where you spend the first 15 minutes psyching yourself up for the jump. OK, I’m on board, depending on how well this works (can’t know now, it’s “coming next month”).
Lastly, we have Clean Up, which is a generative eraser that will allow you to remove photobombers or random objects in a photo. Like the Magic Eraser that Samsung and Google have had for a while now. Again — catching up here.
Record, transcribe, summarize audio
- Record calls and audio - New for Apple
- Automatic summary of recordings in Notes app - Oh, cool!
For the first time ever, you will be able to record a phone call on an iPhone. This feature has never been included, and wasn’t available via a 3rd party app — I always presumed that’s because the legality of doing so varies so much between countries and states that Apple didn’t want the headache.
Now, you will be able to start a recording whenever you are on a phone call. Doing so will trigger a quick audio prompt, letting both parties know that the call is being recorded. After that, the recording can be found in your Notes app, with an AI-generated transcript of what was talked about and what was decided.
This is also launching on Pixel phones right now, and it sounds like a really useful feature. Instead of having to shuffle around to write down a number, a date, or plans being talked about, AI should have them ready for you as soon as you hang up the call.
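Notably, the “number and date ready when you hang up” part barely needs AI at all: once a transcript exists, pulling out such details is plain pattern matching. A rough sketch with made-up regexes (not how Apple or Google actually implement it):

```python
import re

def extract_details(transcript: str) -> dict:
    """Pull phone numbers and weekday mentions out of a call transcript."""
    phones = re.findall(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b", transcript)
    days = re.findall(
        r"\b(?:Monday|Tuesday|Wednesday|Thursday|Friday|Saturday|Sunday)\b",
        transcript, flags=re.IGNORECASE)
    return {"phones": phones, "dates": days}

call = "Sure, call the office at 555-867-5309 and we'll meet Friday at noon."
extract_details(call)
```

Where the language model earns its keep is the step before this: turning messy speech into an accurate transcript, and summarizing what was actually decided.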
You will be able to record speech within the Notes app, too, with a transcription being generated automatically soon after. Kind of how it works on Galaxy phones right now.
So, nothing new, but Apple is keeping step.
Siri
- Understanding complex questions and requests - I would hope
- Follows context and follow-ups - I would doubly-hope
- Can generate how-tos about specific iPhone and Mac features - OK
- Awareness of what’s on-screen and user-specific context - Seen it
- Take action within Apple apps and 3rd party apps - OK
Siri is getting smarter! Apple’s voice assistant should be able to understand complex questions, even if you happen to stutter or repeat some words, which often happens to me when I activate that assistant button. It will also be able to follow context through follow-up requests, which is a promise that we’ve heard a few times over the past 8 years or so.
Siri on my iPad - right now
An interesting new promise is that Siri will now have a full knowledge base of Apple device features. So, you should be able to ask it “How do I do this on my Mac?” or “How do I do this on my iPhone?” and get a concise how-to as an answer. Cool, if it works out!
Further down the line — months from now — Siri is promised to gain awareness of what you are looking at on-screen, plus your personal context from your Apple account and activity. Presumably, this means that, if you are currently looking at an event poster, you’d be able to ask Siri to “Add that to my calendar”. Or “Generate a birthday boy caricature of my brother”, this one employing the Image Playground app, which is also coming later.
Image Generation
- Type a request in Image Playground to generate picture - Seen it
- Transform rough sketch into an AI image - Seen it
- Generate an AI image into empty space in Notes, based on context - Oh, cool!
- Genmoji - But why, though?
Just like Google launched Pixel Studio with the new Pixel 9 series (and it’s still not complete, as it can’t generate human figures), Apple will be launching Image Playground in the coming months. Like DALL-E, Stable Diffusion, and the many other engines you can find online, it will generate a picture based on your prompt.
Now, with Apple having Image Playground baked into iOS, and pulling from your on-screen context, and user information, it has the freedom to streamline the process and make it quicker and easier. Like the example above, where you’d simply be able to ask Siri to generate a cartoon image of someone from your Contacts / Photos.
Image Playground will also work within the Notes app — through a new Image Wand tool. You can circle a rough sketch and have AI re-draw it into a complete image. Or, if you have a long note going and want something visual in it, you can circle an empty area and have AI pull context from the text to figure out what to draw.
Lastly, there will be the new Genmoji. If the 3,500+ existing emoji aren’t enough, you can prompt the phone to generate a custom emoji (Gen-moji, get it?) to use in your texts. I don’t exactly see this one working. People like expressing themselves with emoji precisely because they are a bit limited, and figuring out how to combine a set of emoji is the fun of it. Over the years, specific emoji have become memes, or have become widely understood to mean exact, specific things — kind of like a “hidden message” or an “inside joke”. Generating a custom one kills the joke, and begs the question: how’s that different from just generating an image?
Notifications upgrade!
- Reduce interruptions mode will only deliver time-sensitive notifications - OK, cool
- Messages and email previews will show summary - An actual upgrade!
There will be a new Reduce interruptions Focus mode to choose from. This one will leverage AI to figure out if an incoming text or mail is time-sensitive, and ping you if it is. Otherwise, it will function as DND — that’s pretty cool.
But, more importantly — notifications will now be much more useful. Instead of getting the first few lines of text from a message or email, iOS will actually serve you a brief summary of what that message holds. Now that’s an upgrade and I stand behind it!
Conclusion
OK, let’s tally up the score. After going through all the features that were promised, presented, and shown off, here’s how many times I thought to myself:
- I hope this actually works - 2
- Meh - 7
- That might be OK - 6
- Hey, that’s cool - 3
- That’s actually good - 1
So, either I am an old man yelling at iClouds, or the forthcoming Apple Intelligence really is nothing we haven’t seen before, and nothing too exciting. I am certainly not of the mind that it warranted its own naming scheme, conveniently aligned with the AI abbreviation.
Certainly, you will benefit from it being baked into iOS from the get-go — things like Image Playground working directly within Notes sound like time-savers. But, for the time being, you can very much see and use these features elsewhere. Actually, for the time being, you probably are doing exactly that.
It doesn’t do Apple any favors that it had to softly admit defeat by also promising a ChatGPT integration with Siri further down the line. I guess that will also have to be baked into the iOS core apps for it to make any sense as a feature. As it is right now, you can simply download the ChatGPT app or go to the website to interact with the AI. Again… meh?