What you need to know about Apple Intelligence
Apple’s big AI leap is going to change the way you use your Apple devices in profound ways.
Going into Monday’s WWDC keynote, everyone was expecting Apple to introduce new AI tools into its operating systems, and a lot of the broad strokes leaked ahead of time. Even so, I was blown away by just how much thought Apple has put into its implementation of generative AI, and just how expansive it is. And while Apple did announce a partnership with OpenAI to integrate ChatGPT, most of what it announced is built on Apple’s own AI models.
Branding
Love them or hate them, Apple always creates marketing names for the core parts of its hardware and software in hopes that you’ll remember them. For its new AI tools, Apple has chosen the name “Apple Intelligence,” which Apple says is “AI for the rest of us.” I think this is smart for two reasons. First, AI sounds kind of sinister to many people. We think of the HAL 9000 computer from 2001: A Space Odyssey, which began killing the ship’s human crew because it decided that was the best way to fulfill its mission, or Skynet from the Terminator films, which tried to wipe out or enslave every human on Earth because it perceived them as a threat to its existence. The name “Apple Intelligence” is meant to feel warmer and less sterile. Second, it emphasizes that this is from Apple, whose products consistently earn very high customer-satisfaction ratings from its users.
Security and Privacy
I actually think this is worth covering even before we get to the features because I've already seen loads of false information bandied about online in an attempt to make people afraid of Apple Intelligence. And Apple understands that privacy is a huge concern with AI, so they dedicated a considerable amount of time to discussing how Apple's approach to AI is rooted in deep user privacy. First, Apple has always prioritized on-device computing where possible, and this hasn't changed with Apple Intelligence. As much as it can, Apple Intelligence uses AI models that live and run on your iPhone, iPad, or Mac, so they never have to talk to an off-device server.
But not all models are lightweight enough to run on an iPhone; some absolutely require the power of massive data centers. So Apple announced that it has been building server infrastructure running on Apple silicon chips like the ones in your iPhone and Mac. Here’s what Apple says about its “Private Cloud Compute” method of handling cloud-based AI:
To run more complex requests that require more processing power, Private Cloud Compute extends the privacy and security of Apple devices into the cloud to unlock even more intelligence.
With Private Cloud Compute, Apple Intelligence can flex and scale its computational capacity and draw on larger, server-based models for more complex requests. These models run on servers powered by Apple silicon, providing a foundation that allows Apple to ensure that data is never retained or exposed.
Independent experts can inspect the code that runs on Apple silicon servers to verify privacy, and Private Cloud Compute cryptographically ensures that iPhone, iPad, and Mac do not talk to a server unless its software has been publicly logged for inspection. Apple Intelligence with Private Cloud Compute sets a new standard for privacy in AI, unlocking intelligence users can trust.
When Private Cloud Compute is needed, your iPhone, iPad, or Mac only sends the specific data it needs to Apple’s servers. And crucially:
Your data is never stored or made accessible to Apple, it's used exclusively to fulfill your request…
I’m not saying you should grant complete trust to any Big Tech company when it comes to AI, not even Apple. I am saying the company that publicly stood up to the FBI and refused to create a back door into iOS is the Silicon Valley company most deserving of your trust. Apple has consistently touted its focus on privacy for many, many years now.
Ok, but what can Apple Intelligence do?
It’s Not An App
We’ll get to Siri in a minute, but Apple Intelligence isn’t an app. It exists pretty much everywhere on your Apple devices, in every core Apple app, and it’s coming soon to third-party apps. Mail will prioritize and summarize your emails. When you’re writing an email, you can have it rephrase what you’ve written to sound more professional or more friendly. It can prioritize the notifications on your Lock Screen so that the most important ones get surfaced above the less important ones. It can proofread your writing in Mail, Notes, or Pages. (As someone who writes every single one of these columns in Apple Notes, I can’t wait for it to proofread and edit my writing before I publish embarrassing typos, which I almost always do.)
Like other AI tools, it can generate images for you. One example they showed was having Apple Intelligence create an image of the very person you’re texting in Messages. That’s actually really cool. I’m sure there are existing AI tools that do this, but your basic image generation tools don’t have the context to know your wife’s name and face or your friend’s name and face, and therefore can’t create artwork based on their likeness. This can. And while image generation is built right into Messages and other apps, there’s a brand-new Image Playground app for building these bespoke images in various styles.
Genmoji
I know some of you are going to roll your eyes at this one, but this is going to be a hugely popular tool. You can generate your own emoji based on custom prompts. Ever have that problem where you’re digging through the scores of built-in emoji to find the perfect one for a specific situation only to be frustrated after wasting far too much time on the search? Now you can describe to your iPhone what kind of emoji you want, and Genmoji will build it for you. Even ones based on the likeness of friends and loved ones.
Advanced Searches
Apple Intelligence will know everything that’s on your device. So if you can’t remember when and where you’re meeting a friend from church for lunch, you can ask Siri to remind you, and it’ll pull up all the details from your texts and email. It’s also going to make searching in Photos much more powerful. There are a lot of times now when I’m looking for a specific photo, but Photos isn’t smart enough for me to just describe it in plain language, so I have to hunt for it like a needle in a haystack. Now it’s smart enough to parse your plain-language description and pull up the image or images that match.
Siri
Speaking of Siri, this is the big update we’ve all been waiting for. No more situations where Siri just badly misinterprets you: even if you pause or stumble over your words, it’ll understand what you mean and get you the relevant information you want. It’ll also be able to perform scores of tasks in the individual apps on your phone.
ChatGPT
Before we talk about how ChatGPT works with Apple Intelligence, remember that everything I just described is done with Apple’s own AI models and tools. None of the above requires ChatGPT at all. But you might ask Siri something that it doesn’t have the tools to answer. In those situations, it’ll ask you if you want to send your request to ChatGPT, and you’ll have to grant explicit permission every time it wants to do that. That’s another way that Apple is protecting your privacy. You won’t need a ChatGPT account, and it’ll all be free to use. Apple says it eventually wants to add additional third-party AI tools and models, but it wanted to start with what is generally considered to be the best in the industry.
Whew. That’s a lot. And it’s not remotely everything Apple Intelligence can do. Personally, I can’t wait to start playing with this as soon as Apple makes it available. Although at some point you’ll be able to test these tools in a public beta, these features won’t officially ship until later this year.
Super important note: Apple Intelligence will work on any Mac or iPad with an M-series chip, but on the iPhone you need an iPhone 15 Pro or iPhone 15 Pro Max or newer. That’s going to be a hard pill to swallow for the millions of iPhone users with a two-year-old iPhone or older.
It was the first WWDC keynote that really held my attention throughout and the first time in a while I've been genuinely excited for new software features. As someone who's switched back and forth between iPhone and Android several times over the past ten years, I both LOVE and roll my eyes at the home screen customizations Apple is FINALLY offering. BUT, it looks like they've done it in a beautiful way that will be easier for the masses to understand and use. Similarly, I'm thrilled to be able to use my iPhone virtually from my Mac, which is something I was able to do on Samsung Galaxy phones years ago. This is something I need to do often throughout the work day, and it'll be nice to be able to leave my phone on the charger with StandBy and still access my smart home apps.
As for AI, I had little-to-no interest in it until the past month or two. Again, Apple’s implementation looks like a great first step. It is irritating that it won’t work on my iPhone 14 Pro Max, but I guess I shouldn’t be that surprised. I hadn’t planned on being interested in the new phones this year, but the AI tools may be worth an upgrade (though it’ll mean giving up my lovely and beloved Deep Purple phone...).