Most people understand that on-device AI requires a powerful NPU to function properly, but there are other requirements that get less attention, like RAM and storage. And we must face the understandable reality of upsell: Some AI features will be limited to new devices only so that hardware makers can benefit from an accelerated upgrade cycle.
Will it work?
The non-NPU hardware requirements first came to light when Google brought its Gemini Nano on-device small language model (SLM) to the Pixel 8 Pro in late 2023. The same issue explains why Copilot+ PCs have a 16 GB RAM minimum, compared to 4 GB for other Windows 11 PCs, and a 256 GB (non-HDD) storage minimum, compared to 64 GB (which can be HDD). And now we're seeing it again with Apple Intelligence, which will be backported to the iPhone 15 Pro series but not to the non-Pro iPhone 15s.
On its Apple Intelligence page, Apple explains that its hardware-accelerated, hybrid AI system will be made available on all Apple Silicon (M1, M2, M3, and M4 series) Macs and iPads, and on the iPhone 15 Pro and Pro Max. But those are the only two iPhones supported: The iPhone 15 and iPhone 15 Plus don't make the cut.
Unlike the Windows 11 hardware requirements, this restriction isn't artificial: The iPhone 15s fall short in two key areas needed for on-device AI.
Yes, the first one is the NPU. The base iPhone 15s have a lesser Apple Silicon processor—an A16 Bionic vs. the Pro's A17 Pro—with a far less powerful NPU that delivers just 17 TOPS of hardware-accelerated AI performance. The A17 Pro's NPU is twice as fast, at 35 TOPS.
But it's not just the NPU. The iPhone 15s also don't have enough RAM to handle on-device AI: 6 GB vs. the 8 GB in the Pro models. Thanks to AI, phone makers will be installing a lot more RAM in their devices than they have to date.
Storage might also be an issue—the base iPhone 15s can be had with as little as 128 GB of storage compared to the 256 GB minimum on the Pros—but it seems like Apple will use only a handful of on-device models, compared to one on Pixel (see below) and over 40 (!) on Microsoft's Copilot+ PCs. It's likely that the processor/NPU and RAM differences are the bigger issue.
And I believe that because of what happened with Google and the Pixel 8.
When Google announced its Pixel 8 family of phones in late 2023, it heavily promoted a wide range of AI capabilities, as it did with previous Pixels. But that December, the online giant announced its Gemini family of AI models, and that the smallest of those models, Gemini Nano, would be used on-device on the Pixel 8 Pro, thanks to a Pixel Feature Drop.
Though this was the first time a phone maker put a modern SLM on a phone, Gemini Nano is still used for just two AI-accelerated features, the same two it launched with.
"As the first smartphone engineered for Gemini Nano, the Pixel 8 Pro uses the power of Google Tensor G3 to deliver two expanded features: Summarize in Recorder and Smart Reply in Gboard," Google explained. "Gemini Na...