
Those who believed Apple was somehow “behind” on AI were always missing the point. Apple is well ahead of the AI market leaders in one key way, and it’s the one piece that’s missing elsewhere: Trust.
Apple’s WWDC 2024 keynote was a whopper, with improvements across its vast ecosystem of hardware, software, and services. Some of the announcements–like the long-awaited ability to put icons anywhere on-screen in iOS–were rumored in advance, while others were welcome surprises. But nothing Apple said or is doing matters more than the way it is building on the trust its customers already have in this company and its offerings.
The umbrella term for this work is Apple Intelligence, a wink-wink take on AI. It’s described as a personal intelligence system that puts powerful generative models at the core of the iPhone, iPad, and Mac. Which, on the surface, sounds a lot like what Google is doing with Gemini, Android, and ChromeOS, or what Microsoft is doing with Copilot, Windows, and Microsoft 365. But the differences between AI–artificial intelligence–and Apple Intelligence are stark.
Aping the “people at the center” marketing that Microsoft brings back every few years, Apple Intelligence addresses a key issue with existing AI chat tools like ChatGPT or Copilot, which use “world knowledge” but know very little about the user. It “draws on your personal context” that’s helpful and relevant to you. It …
Wait a second.
That sounds an awful lot like Microsoft Recall, doesn’t it? That unfairly maligned feature in Copilot+ PCs that was never a privacy issue but for one thing: It’s made by Microsoft, which no one trusts. And Microsoft never explained it correctly, despite the obvious need … until it did. The changes Microsoft made to Recall were minor, and mostly addressed privacy-related dark patterns and not the fake security issues some so-called security researchers raised. But the damage had been done. No one trusts Microsoft. And Recall, as a result, is tarnished.
So what about Apple Intelligence? Aside from the obvious–more people trust Apple than trust Microsoft–what really differentiates what Apple is doing from what Microsoft is doing?
A couple of things. Clear communication, always key, and an Apple strength. And, as important, it’s not taking its relationship with its customers for granted, as Microsoft does. It knows that trust is tenuous and can be destroyed at any time. And so it is being explicit about why customers should trust Apple Intelligence.
Forget about the capabilities. Yes, Apple Intelligence is deeply integrated into the Apple experience across the devices and apps you use every day. Yes, there are on-device language models that make it all work. But all that is obvious. What matters is trust.
When you make a request, Apple Intelligence determines whether it can be processed on-device. If it can, it is, and what happens on your iPhone (or whatever device) stays on your iPhone. But if it needs greater computational capacity, it draws on Private Cloud Compute, meaning Apple Silicon servers in Apple’s datacenters. Your data is never stored on those servers. It’s never made accessible to Apple. It’s used only to fulfill the request. And independent experts can inspect the code running on Apple’s servers to verify Apple’s privacy promise.
In other words, Apple isn’t content to believe that you will trust it. It will verify that your trust is not misplaced. Trust and verify.
“Private Cloud Compute cryptographically ensures that your iPhone, iPad, and Mac will refuse to talk to a server unless its software has been publicly logged for inspection,” Apple’s Craig Federighi said during the event. “This sets a brand-new standard for privacy in AI, and [it] unlocks intelligence you can trust.”
I know there are those who will never choose Apple for various reasons. And those who will try to pick apart this message. But we live in a world in which OpenAI is stealing content to get ahead as Big Tech races forward, damn the torpedoes, to stuff AI down our collective throats. And this is what’s been missing. Not empty words about “responsible AI,” but real, meaningful steps to bring trust, verifiable trust, into this equation.
Cynically, yes, this will require new hardware. So, like Copilot+ PC, it’s an attempt to shift boxes to a user base that had slowed its upgrade cycle. But where Copilot+ PC is a tough sell for the PC market, Apple Intelligence will find an eager audience ready to upgrade for these advances.
And that’s true even though Apple, like Microsoft, is partnering with OpenAI.
“Privacy protections are built in for users who access ChatGPT — their IP addresses are obscured, and OpenAI won’t store requests,” Apple notes. “ChatGPT’s data-use policies apply for users who choose to connect their [OpenAI] account.” Right. Apple will protect you from those monsters, too.
This may be tough on the Microsoft loyalists. There is something unfair here, a feeling that Apple could have literally introduced something identical to Recall and its user base would have embraced it wholeheartedly. But this is what happens when you spend a decade building trust. It’s the opposite of what Microsoft has done with Windows 11, in particular.
There’s a lot more detail to explore here, and we’ll do that. For now, I want you to consider how starkly different the world looks today. Apple, which was allegedly “behind” in AI, a position I never felt good about, is suddenly showing the rest of the market how it’s done. In its rush to bring AI to the world, Microsoft has behaved irresponsibly. And it may pay a heavy price for that: Where Microsoft’s approach almost seems drunken and ill-conceived, the future Apple shows feels safer and less haphazard.
It is, as Apple says, “AI for the rest of us,” a welcome callback to one of its earliest taglines. Nicely done.