
Happy Friday! We’re coming up on a long Memorial Day weekend here in the U.S., so let’s kick it off early with some great reader questions.
christianwilson asks:
The push to integrate AI into everything got me thinking about energy consumption.
We are living in a time where we are told energy efficiency is critical, yet we are now seeing AI, which is currently far less efficient than a traditional Google search, being added to everything. I don’t think it is as bad as the cryptocurrency energy use (but admittedly haven’t done my research to know for sure), but it looks like we are going backward by using inefficient tools before they are optimized. Sometimes I feel like companies and end users alike opt to drop their stance on climate/resource control for the sake of convenience. But maybe AI isn’t as terribly inefficient as I am led to believe. Do you have any thoughts on this?
I don’t know a lot about this, so here are a few observations.
Whatever the number is, AI queries cost some multiple of normal search queries, so, yes, one can assume that a big chunk of that cost comes in the form of energy consumption. On the flip side, one of the big pushes right now is in designing more efficient datacenter chipsets to make this kind of thing viable and more affordable. (As Microsoft is doing with Project Athena.) And on the client side, we’re moving from GPUs to NPUs for the same reason, and the resulting hybrid AI solutions should help too, by offloading as much as possible from the cloud. I guess what I’m trying to say in my own uneducated way is that this should get better after the initial surge. We’ll see.
I don’t think we wrote about this, but Microsoft this month literally invested in a nuclear fusion startup, which reads like an Onion headline. This is about reaching its goal of being carbon negative by 2030, which seems like an impossible goal given all this AI stuff. But this is the company that sank a datacenter to the bottom of the ocean to see whether it could be cooled efficiently. One wonders what a lot of those would do to the oceans and the earth’s climate, of course, but these things are always a problem for some future generation. Which is a problem.
I just read in The Washington Post this morning that NVIDIA is “suddenly one of the most valuable companies in the world.” Why? Because Big Tech companies are using its GPUs to run AI workloads. The company is about to surpass a $1 trillion market capitalization, and it’s now the sixth-biggest company in the world. I still think of NVIDIA as a company that makes graphics cards for PCs, but for reasons it never predicted, those chips were, and still are, ideal for AI workloads in this pre-NPU world. It’s sort of like describing Nintendo’s success in gaming as a continuum from its start as a maker of playing cards. That is, its originators could never have predicted the successes to come.
We live in an area of Pennsylvania that largely consisted of big farms, but those have been going away over time, at first to some degree because of coal and steel, and these days because of warehouses for Amazon and other shipping-based companies. This change has really transformed the area for the worse. The highways are clogged with trucks and are all ruined, as are the areas near these facilities. We don’t have datacenters here to that degree, but I have to wonder what’s happening where these things are located. They obviously don’t have the truck issues, but they’re just sitting there like giant electricity-sucking cancers. Are they subsidized locally as the warehouses here are (stupidly)? I wonder. And as more and more jobs are lost to AI, we will need more and more of these things. Is this what the earth becomes? Trantor from the Foundation series? (Or, more recently, Coruscant from Star Wars?)
Obviously, I have no idea. But it’s right to question what’s happening. I hope there are more knowledgeable people than me working on this.
JustMe asks:
While these are technically questions, this is more of a thought discussion than a traditional collection of Ask Paul queries. I understand that you will likely not be able to answer any of this, but I figured I’d ask anyway as it might lead to a series of interesting articles. With Microsoft’s push to get AI nearly everywhere in its products, it got me to thinking about the “learning” (read: data) the AI collects/collates/uses. How will that data be stored, if it is stored at all? Client-side or server-side? Shared? Can the user “clear” (delete) or “reset” AI data? Will AI such as Copilot, once it is rolled out, have access to everything on your machine – pictures, documents, financial data, health data – or will we be able to limit it? Will you be able to limit its influence/learning to specific apps? Can you turn it off completely? And with Copilot, given Microsoft’s recent penchant for interrupting the user with pop-up notifications everywhere (I’m looking at you, Teams and Edge), how intrusive will it actually be? Will you be able to tell it to leave you alone, or will it become the next generation of hyperactive Clippy, exhorting you to do whatever at the least opportune (to the user, anyway) moment?
This is another question (or set of questions) we are right to ask about AI, and it all falls under the same privacy umbrella that, sadly, few people seem to care about today and, worse, tech giants mostly address with what I call “privacy theater,” where the appearance of privacy is about as effective at protecting it as the TSA is at protecting us while flying. It’s mostly just for show.
The problem here is the same as it is with anything else tech-related these days: to the typical consumer, the question boils down to whether it’s worth it to let Google (and its advertisers) gain access to your personal data so that you can enjoy the benefits of Google Search, Maps, or whatever else. And most people just say, yeah, it is. And so when I write about a topic like Brave, and why you shouldn’t use Microsoft Edge or Google Chrome for privacy reasons, I get the occasional insane response, the best of which I already discussed somewhere: when I noted that Edge violates your privacy no matter how you configure it, one guy—who, unbelievably, is also a tech writer—informed me that, sure, but Edge lets him put a History button on the toolbar so he still uses it. Cue sadtrombone.wav.
To the specifics of your question, I think we can look at Microsoft’s privacy policies across its current consumer and enterprise offerings and believe that they will map directly to the coming wave of AI capabilities. That is, it will engage in privacy theater with individuals—we’ll see one more privacy On/Off switch in the Windows 11/12 EULA soon related to this, and then an inscrutable set of more granular controls in Settings that no one will ever look at—and it will just keep using telemetry data and your private information as it does now. And for businesses that pay it a lot more, it will allow them to store their data in their own tenants and keep the AI learnings internal. Microsoft will likewise adhere to the letter of whatever data governance laws exist worldwide, like the GDPR.
Whether this is outrageous or not depends on the individual, I guess. Consider the recent news that Microsoft has started searching for malware inside password-protected ZIP files stored in OneDrive. To me, that seems reasonable, since that would be an obvious way to transport malware online. But I saw headlines that stated that Microsoft was looking inside encrypted ZIP files, which is not true. So there will be sensationalism too.
Ultimately, what we probably need is AI-specific regulation at the governmental level, and that will be controversial in some circles. But I do think it’s telling that the biggest sources of AI now—Google, Microsoft, and OpenAI—have all publicly explained the need for this. Telling and maybe scary, too. And we need to look for ulterior motives there. Companies are never altruistic.
Jrzoomer asks:
There’s a rich history of pen computing devices going back decades. Over time, the dedicated pen support built into Windows 10/11 with Windows Ink, as well as the hardware, has gotten excellent. Despite this, for the most part the primary users of pen computing have been those in specific industries, such as art and design.
Do you see a future for pen computing becoming more mainstream? And could AI be the big unlock, given the possibility of training models for improved handwriting recognition and context-awareness?
No. But that doesn’t mean that pen computing isn’t useful or important.
I think the past 20 years of pen computing on Windows have proven that there are valid use cases for this technology, including those you mention, like art, but also note-taking for students and professionals, and probably some CAD/engineering-type applications. But at the end of those 20 years, pen computing hasn’t become generally popular, and far more people interact with mobile devices using their fingers, and with PCs using a keyboard and mouse. And so it has emerged as one of many ways to interact with computing devices.
The issue from a productivity perspective is that handwriting isn’t efficient. Everyone’s handwriting is different, and some is much more readable than others. It’s slower than typing, much slower. And as personal technology has taken hold, schools have stopped teaching cursive writing. My kids, both now over 21, never learned it. And it is even slower to write in non-cursive block letters than it is in cursive. I used to have beautiful handwriting (and I was an artist too). But these days, my hand cramps up when I write a check. I went to the Post Office the other day to ship a package, and my hand cramped up again filling out the address form. I had to pause and shake it repeatedly to get through it. And the result was barely readable.
It’s good that pen computing is an option. But I think voice interactions will quickly overtake it, if that hasn’t already happened, and that smartpens will continue to be popular for specific uses but not for computing generally.
andrew b asks:
Why does everyone bring their dog to the grocery store now? Can we not?
LOL.
God, I could go on for a long time about this kind of topic, but I suspect I would just anger the more sensitive. So I will summarize: as I’ve stupidly tried to explain to others from time to time, no one loves you, your kids, or your freaking dog as much as you do. And none of you are as special as you think.
I’ll leave it at that.
helix2301 asks:
I know you have said many times that big companies like Microsoft get Apple envy. Wasn’t there a time when Apple had Microsoft envy? I love technology history. I remember Jobs coming back because of Microsoft. I remember IE being on the Mac. I remember when Apple created its own office suite to compete with Microsoft. I remember when Apple made Safari for Windows and iTunes for Windows. I think people forget how desperate Apple was to catch up to Microsoft pre-iPhone.
Because I focus on Microsoft, a lot of what I write does come from that perspective. But Apple has copied Microsoft as often as the reverse, and it’s fair to say that Apple—channeling Steve Jobs—was envious of Microsoft’s success back when Apple was struggling. Today, of course, the tables have turned, given that Apple is the biggest company in the world. And so there is a lot of history rewriting going on that I’m not a fan of. And this has even impacted Microsoft.
For example, Microsoft partnered its way to success, while Apple’s success was largely about going it alone. Does either strategy invalidate the other? No. In fact, one of the notable things about Apple is that its strategy has never worked elsewhere at scale. And yet, companies keep trying to copy Apple. Including Microsoft, which observed the rise of mobile and the resulting fall of the PC and responded by competing with its partners in the PC space, as if the partnering model was the problem in the PC market, a model that had not just worked for over 30 years but was in fact responsible for Microsoft’s one-time dominance.
The other thing I wanted to ask about: remember Novell? They were huge at one point, until Microsoft kind of copied them with Windows 2000 Active Directory and Novell started a slow decline. The two companies still going strong despite Microsoft copying them are Citrix and VMware.
Yep. I’ve been re-reading some materials from the early 1990s regarding MS-DOS, OS/2, and Windows, and one of the side stories there is the rise of Novell and NetWare and how this notion of a local network as a superset of a single system’s file system was a capability that Microsoft and IBM needed. A big chunk of the 1990s was about getting those capabilities into Windows, and then moving from workgroup computing to a client-server model with directory services (first WINS and then Active Directory). But Novell suffered from a major case of Microsoft envy, and that was its downfall. It even tried to compete with Microsoft Office early on by buying WordPerfect and partnering with Borland (yet another company that couldn’t stand Microsoft). (Here, as always, companies are stand-ins for their leaders, Ray Noorda at Novell and Philippe Kahn at Borland in this case.)
Most of these companies are gone now, but the two you mention, Citrix and VMware, are perhaps special cases. It’s curious to me that Microsoft never destroyed or purchased Citrix—I think it was Mary Jo who suggested that it must have some dirt on Microsoft, because it just doesn’t make sense—while VMware is to virtual machines what Amazon is to cloud computing, a market leader that Microsoft never seemed able to beat. I interviewed VMware co-founder and then-CEO Diane Greene around the time that Microsoft was bringing Hyper-V to market and was curious why the company didn’t sue Microsoft for antitrust product bundling. But she said that they had the better product and that the better product would win. That never worked for other Microsoft competitors back in the day, but VMware is still around. (Though, to be fair, it’s been passed around by various corporate overseers over the past 20 years too.)
Honestly, if there is a big takeaway to all this, it’s that diversity in computing is healthy. When it was just Microsoft running roughshod over the industry, lots of innovative companies got tossed to the wayside. Today, we have several Big Tech firms instead of one. It’s not perfect, and most of them still need to be curbed in some way. But it is better.
cwfinn asks:
As heart-wrenching as it feels, can we assume Windows is now a dinosaur in Microsoft’s world, just (to mix metaphors) a cash cow to be milked to death?
There may be a healthier way to look at this, and one that maybe even Windows fans can live with.
I covered the Build 2023 day 2 keynote with Leo and Richard, and at one point someone, I think it was Rajesh Jha, said something about Windows that I felt needed clarification: he wasn’t speaking about developing software for Windows, he was speaking about developing software on Windows. In other words, we’ve moved past the point where native software applications on Windows are the focus, but one of Microsoft’s messages is that Windows is the best place to be for developers. We can debate that. But I think it’s an important point: Microsoft has done a lot in recent years to make Windows attractive to developers. In fact, it’s kind of incredible.
Now, we might cynically note that saying that “Windows is good for developers” is akin to saying that “the Mac is good for graphic artists,” in that it’s a sort of backhanded compliment that really means it’s not good for anything else. But that’s not what I mean by that. When it comes to personal computing, we all know that most of the usage is happening on mobile, mostly on smartphones, and that even in desktop computing, much of the usage is happening online, on the web. And so you can say that the role of Windows is diminished, overall.
But there’s more to it. Microsoft was smart to expand beyond Windows, and it started this work before Windows was diminished, with Office, and then with Windows Server. Today, I estimate that over half of Microsoft’s revenues come from the cloud, but whatever the number, it dodged a bullet by diversifying. Should Windows continue to decline, Microsoft will be OK.
But is Windows declining?
The Windows user base today is larger than it’s ever been. Windows revenues are down from a one-time pandemic high, but they’re higher than they were in pre-pandemic quarters. Microsoft cites engagement numbers I don’t trust, but whatever. The hard numbers we have are enough, because that’s all that matters to Microsoft and the attention it gives this product.
Compared to its cloud work, yes, Windows is in the back seat now. You can see that literally at Build, where Windows is on day 2 and the exciting cloud and AI stuff is on day 1. But it’s still important. It’s still contributing tens of billions of dollars in revenue every quarter.
And Microsoft is not ignoring it: Windows is getting a Copilot, just as Edge, Bing, and Microsoft 365 are. It’s pushing developers to use Windows. It has dominant Office productivity software that works best on Windows, software that bucks the norm when it comes to people using web- and cross-platform apps. It’s never going to be exciting like mobile, because it’s about work. But it’s important … because it’s about work. For so many people, so many companies, Windows is work. It’s productivity.
I don’t see Windows or PCs going away anytime soon because, for that to happen, devices like Chromebooks (which are basically PCs), iPads, and Android tablets would need to get a lot more useful for the core productivity tasks at which PCs excel. And that could happen. Probably will happen. But it hasn’t happened yet.
I may write about this more fully soon. I want to go back and rewatch the Build keynotes and view a few sessions first. But this is the first time in a while that I’ve felt good about Windows after a Build conference.