
Copilot+ PCs differentiate themselves from other PCs in just one way: They have a more stringent set of hardware requirements than other Windows 11 PCs that includes, most notably, a sufficiently powerful NPU (neural processing unit) that enables unique on-device AI capabilities.
And that’s a problem. For multiple reasons.
Most obviously, Copilot+ PC bifurcates the market into haves and have-nots. It does so outside of the normal SKU, or product edition, system in which consumers typically have access to Home and Pro versions of Windows, the latter with additional capabilities. Yet you can buy Copilot+ PCs running either Windows 11 Home or Pro, further confusing matters for customers.
It’s also problematic because the on-device AI capabilities that Copilot+ PC provides are mostly lackluster. This was especially true at launch one year ago, but it’s still true today, despite a few cool features like Click to Do. And even that is a needlessly granular bit of functionality, nice to have but not crucial. It’s also unclear whether Click to Do would even work without an NPU. Is that requirement artificial?
The biggest and most obvious problem, however, is that the Copilot+ PC specification ignores the powerful GPUs and APUs that many already have in gaming PCs, workstations, and other PCs. Dedicated GPUs from Nvidia and other companies offer dramatically better performance than even the fastest NPU, often 10 times the TOPS or more, and yet having one doesn’t give you any Copilot+ PC features.
The customers who spent lots of money on dedicated GPUs, whether they’re gamers, scientists, developers, or whatever else, are Microsoft’s best (Windows) customers. The on-device AI capabilities provided only to Copilot+ PC buyers today could easily be handled by these PCs. And, most likely, by many modern Windows PCs of whatever type. But Microsoft is artificially limiting those features. For reasons.
This creates a chicken-and-egg problem that guarantees the PC industry cannot, and will not, move forward quickly enough to embrace the AI wave in a way that helps the interested parties: PC makers, silicon vendors, developers, and customers. And that makes one wonder why Microsoft went with this strategy in the first place.
As we’ve discussed, all PC buyers will eventually end up with Copilot+ PC functionality. That can happen in the short term by buying a Copilot+ PC, of course. But it will happen in the long term because every PC, eventually, will be a Copilot+ PC. Just as the TPM (Trusted Platform Module) went from specification to optional to mandatory, so too will powerful NPUs. In time, all PCs will come with a powerful NPU integrated into their processor packages.
But the PC market is an ecosystem. It includes developers who create, maintain, and update apps and services. It includes PC maker partners that create devices to sell to customers. And it includes those customers, who must evaluate the available offerings and make their choices when it’s time for a new PC.
I made the case last year that no one should buy a Copilot+ PC for the AI features. Instead, this platform, which started out as exclusive to the Arm-based Qualcomm Snapdragon chipset, was a solution to age-old PC problems like reliability, efficiency, battery life, and performance. Since then, AMD and Intel have entered the market with x64-based Copilot+ PC chipsets of their own. And while the resulting PCs do not match the reliability, efficiency, and battery life of their Snapdragon competitors, they are both steps forward that close the gap somewhat. They are better chips than their predecessors. But again, you don’t buy one of these things because of on-device AI. That’s just something you get. And for most, I think, ignore.
And it’s not just Windows. No one–literally–is creating new NPU-centric apps for Windows that will run exclusively on Copilot+ PCs. Why would they? For starters, it’s difficult to imagine what such an app would even look like. But the target market is so small compared to the broader market for Windows PCs that limiting an app in that way would be financial suicide. It doesn’t make any sense.
What some developers are doing, and one gets the feeling that even this step requires serious cajoling from Microsoft and maybe Qualcomm, is building specific local AI features into their apps. In doing so, they emulate what Microsoft is doing on Copilot+ PCs. Instead of major new functionality, you get individual features that only work if you have an NPU or, more likely, work a bit more efficiently. But these features are lost in a sea of functionality. And they’re rare.
More commonly, we see developers adding features that use local AI if it’s available but that work with whatever chips the customer’s PC has. That is, they will use a CPU or GPU if necessary. For example, Affinity Photo 2.x has a newish Object Selection Tool that uses on-device AI. You have to download a small AI model before you can use this feature, as is the case with some Copilot+ PC features. But it runs on, wait for it, the CPU. No matter which type of PC you have.
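This pattern can be sketched in a few lines: prefer an NPU when the system reports one, fall back to a GPU, and guarantee a CPU path so the feature works on every PC. This is a minimal illustrative sketch, not Affinity’s (or anyone’s) actual API; the backend names and the preference order are assumptions.

```python
# Hypothetical sketch: pick the best available backend for a local AI
# feature, guaranteeing a CPU fallback so the feature runs on any PC.

PREFERENCE = ["npu", "gpu", "cpu"]  # most efficient first; every PC has a CPU

def select_backend(available):
    """Return the first preferred backend this PC actually has."""
    for backend in PREFERENCE:
        if backend in available:
            return backend
    return "cpu"  # unreachable in practice, but explicit

# A Copilot+ PC reports an NPU; a gaming PC reports a GPU; any PC has a CPU.
print(select_backend({"npu", "gpu", "cpu"}))  # npu
print(select_backend({"gpu", "cpu"}))         # gpu
print(select_backend({"cpu"}))                # cpu
```

The point of the sketch is that the CPU path makes the feature universal; the NPU or GPU just makes it faster or more efficient when present.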
Why? All PCs have CPUs, that’s why.
So here’s Microsoft. It creates an artificial set of hardware requirements for Windows 11 in 2021 that alienates much of the user base for all the obvious reasons. In time, those requirements become more reasonable, in part because the security promises of the past are now a reality. But this strategy was apparently successful enough for Microsoft that it did it again with Copilot+ PC. And while AMD and Intel did jump on board, belatedly, they didn’t do so across their entire chip lineups. And making it happen quickly played a major role in Intel’s financial struggles last year.
This was unnecessary. Windows 11 could simply do what operating systems do and answer the needs of running apps and services by determining which hardware components are available on any given PC and doling out the work to the right components accordingly. This is called orchestration, and if Windows 11 were properly architected for local AI, as it is for everything else it does, then the local AI capabilities that are today artificially limited to Copilot+ PCs would work with almost any Windows 11 PC.
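To make the orchestration idea concrete, here is a minimal sketch of routing AI work to whatever accelerator a PC has, gated on capability rather than on one specific chip type. The class, the function, and the TOPS figures are hypothetical illustrations, not how Windows actually schedules this work; the roughly 40 TOPS figure reflects the Copilot+ PC NPU requirement.

```python
# Illustrative sketch of orchestration: send a task to the most capable
# accelerator that meets its needs, whatever kind of chip that is.

from dataclasses import dataclass

@dataclass
class Accelerator:
    name: str
    tops: float  # rough AI throughput, trillions of operations per second

def orchestrate(task_min_tops, devices):
    """Pick the best device that meets the task's requirement.

    If nothing meets the target, fall back to the best device we have;
    the feature still works, just more slowly.
    """
    capable = [d for d in devices if d.tops >= task_min_tops]
    pool = capable or devices
    return max(pool, key=lambda d: d.tops)

# A gaming PC: no NPU, but a dedicated GPU that dwarfs the roughly
# 40-TOPS Copilot+ NPU bar, plus a CPU as the universal fallback.
gaming_pc = [Accelerator("cpu", 2), Accelerator("gpu", 400)]
print(orchestrate(40, gaming_pc).name)  # gpu
```

Under a scheme like this, a gaming PC with a powerful GPU would qualify for the same features a Copilot+ PC gets, and an older PC would still run them on the CPU.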
Put another way, when Windows 11 first arrived with those artificial hardware requirements, some users complained that their out-of-date or TPM-less PCs could run the OS just fine, and they worked around the installation/upgrade blockers to prove their point. Today, those with modern PCs with integrated or dedicated GPUs and APUs are complaining that their PCs are more than capable enough to run those Copilot+ PC exclusive features, but they cannot do so.
Why we don’t have many workarounds to this issue is an interesting question. And while this will sound cynical, it’s likely true: None of the exclusive local AI features are compelling enough to drive any demand. Whomp-whomp.
It’s worth remembering that NPUs grew out of work that Google, Apple, and Microsoft did to create custom chips that could perform specific tasks on-device better and/or more efficiently than the CPU or GPU. For example, Google created an image processing chip called Pixel Visual Core for the Pixel 2 family that helped speed its nascent computational photography features. Apple was talking about the machine learning (ML) capabilities of the “neural engine” in its A-series chips for iPhones years ago and for similar reasons. And Microsoft created a set of video and audio effects called Windows Studio Effects that only ran on the early Windows-based Arm processors of the day.
NPUs make the most sense when efficiency is key, and that’s especially true for smartphones and other mobile devices. Likewise, the use cases make sense on mobile, too. We all take photos with our phones, and computational photography is how phone makers overcome the physical limits of the camera hardware to create incredible images and videos.
But it’s much more difficult to justify an NPU–or require an NPU–on a PC. Yes, most PCs are laptops these days, and, yes, efficiency matters. But so does perspective. Laptops have bigger batteries than phones and aren’t used all day, every day, out in the world and far from power. Most PCs are near power most of the time. Many PCs have other more powerful chips that could handle those tasks. And when you add up those realities and combine them with the scattered advantages of on-device AI while using a PC, well, it doesn’t add up.
There isn’t a single on-device AI capability in Windows 11 today that would cause anyone to dump their existing PC and buy a new one. There are, however, hundreds of millions of PCs out in the world that could run all or most of the current Copilot+ PC exclusive features … if Microsoft would just let them. And that doesn’t add up either.
Why Microsoft continues to limit Copilot+ PC features to those PCs with specific NPUs and other components is unclear. My guess is that it’s tied to the Qualcomm partnership and that AMD, Intel, and the PC makers on some level appreciate the upsell opportunity. But customers who spend extra to get a gaming PC receive an obvious benefit for the additional cost. What real-world local AI benefits does an NPU give us?
That’s semi-rhetorical, as there are some benefits, of course. But they’re minor and, in many cases, could easily be implemented more broadly. The new semantic search capabilities are a great example. Why does this require an NPU? A CPU might index that information more slowly, but … so what?
Further hurting matters, the most desirable, most capable, and most compelling AI experiences are all cloud-based and will run on any PC or device. You don’t need an NPU to use any of the features in Copilot or Microsoft 365 Copilot. And even if you have one, none of them will use it. ChatGPT is entirely cloud-based. So is Anthropic Claude. And Perplexity AI.
And that raises an interesting concern.
All the platform makers–Apple, Google, and Microsoft–are foisting local AI features on their users. This makes some sense on mobile, less so on PCs. But the standalone AIs, all of them, are purely cloud-based. Not so much out of necessity, but for the same reasons that any developer is right to ignore on-device AI today. The cloud works everywhere. Local AI does not. Cloud-based AI is powerful. Local AI is a tinker toy, comparatively speaking, or it’s specific to some task that’s just a feature of a bigger solution. The only advantage that local AI has, really, is that it works offline. But PCs are almost always online, just as they are almost always near power.
So Microsoft is limiting the availability of Copilot+ PC local AI features artificially. It is doing so in an attempt to goose sales of more expensive laptops to help its PC maker and silicon partners. But in doing this, it also disadvantages its best (Windows) customers, those that spent a lot of money on expensive PCs with powerful GPUs. And, if I’m right, most Windows 11 users.
And that’s enshittification, plain and simple. Copilot+ PC is correctly seen as yet another example of enshittification in Windows 11, alongside the ads, the tracking, the terrible Edge and OneDrive behaviors, the forced Microsoft account usage, and everything else. And it’s a particularly egregious example because it’s so unnecessary. Microsoft can at least make a good argument to customers about why signing in to a Windows 11 PC with an MSA is better for them. But it cannot do that for the Copilot+ PC features it denies to most customers.
It seemed obvious to me last year that Microsoft would open up these Copilot+ PC features to those PCs with powerful dedicated GPUs or modern CPUs with integrated graphics. But now I wonder. Mostly, I wonder what the point is. Developers will never embrace these capabilities if they’re not available more broadly, and most customers will never see them anyway because they’re limited to Copilot+ PCs they don’t yet own. If Microsoft is serious about pushing AI forward–and it seems like this is perhaps the singular point of this company today–it would unleash all that Copilot+ PC functionality for everyone, setting it free so that it can be used by the broadest audience possible. Microsoft’s problems with Copilot usage broadly–meaning the free cloud-based Copilot capabilities and Microsoft 365 Copilot–only make this need more obvious.
Lose the Copilot+ PC brand, Microsoft, and set its features free. Before it’s too late to matter. It’s the only way forward.