One week after Microsoft posted its first “vision” video for the future of Windows, Pavan Davuluri, the man directly in charge of the product, has weighed in. And as you might expect, he also sees AI as key to coming advancements in the “desktop PC platform.”
“Today, we’re on the journey of building the next generation of Windows for the breadth of our customers,” Davuluri says in an internal video interview. “That includes work on Windows 11, Copilot+ devices, Windows 365, and Azure Virtual Desktop.”
To be clear, this video is not part of that “vision” series, which still consists of the one video. That video was produced by the Windows team, while this more recent video with Davuluri is from the Windows IT Pro group in Microsoft 365. It starts off with stupidity related to recent books read and favorite emoji (seriously) before finally turning to what we’re all here for.
“The interaction experiences on Windows are going to change, the business models are going to change, and the experiences are going to change in the next coming couple of decades,” Davuluri finally says after a tedious discussion about finding the time to catch up on favorite podcasts. “When I think about the work we are doing in Windows, we continue to be fueled by that ambition of making transformational changes that are impactful across the planet, across the range of customers and use cases.”
“[When] I think [about] what human interfaces look like today and what they will look like five years from now, one big area of thrust for us is that Windows continues to evolve,” he continues. “The operating system is increasingly authentic and multimodal, with voice and vision and pen and touch, just like we use a mouse and keyboard. So that is an area of tremendous investment and change for us and its evolution here in 2025.”
So, that’s all pretty straightforward and as expected. And David Weston said much the same in that “vision” video. But the difference here is that Davuluri is in charge of Windows product development. So his discussion of these changes, some now in progress in Windows 11, carries a bit more weight.
When asked about Copilot+ PCs and how AI has essentially solved problems with file search, Davuluri said that having powerful AI models that can run on the PC itself is “transformational.”
“They [the on-device AI models] bring a bunch of new capabilities and agencies to the platform and the device itself,” he says. “Multimodal interaction is one component of it, and ideas like voice and vision becoming available are one construct. The second big construct is, once you have these models that have reasoning capabilities, your tool orchestration capabilities on the edge, you can now start building new OS primitives that in turn bring new features and capabilities of the operating system to customers … Click to Do and an improved version of search are two examples of these new capabilities. Improved Windows Search gives you the ability to take what was a traditional lexical indexer in Windows for over two decades, and now augment that lexical indexer with a semantic indexer.”
“The beauty with the semantic indexer is it has the capability to understand content that is being searched in addition to keyword search,” he continues. “And so search is much more meaningful. It is much, much more impactful. It has a better chance of returning results. It is aware of different modalities of content. And the same is true for Click to Do, the ability for us to be able to find users in the flow of their tasks and have the AI models then help them with the intent and flow of their tasks in themselves, and reduce friction and interruption across the OS, across apps, between the OS and the apps. [This] is really our intent with AI, to be able to give folks the ability to super-power their productivity on a daily basis.”
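To make the lexical-versus-semantic distinction concrete, here is a minimal, purely illustrative sketch. Windows’ actual indexer implementation is not public, and real semantic indexers use learned vector embeddings; the tiny hand-made concept map below (documents, `CONCEPTS`, and both search functions are all hypothetical) just shows why a semantic layer can match content that exact keyword search misses.

```python
# Illustrative sketch only; not Microsoft's implementation.
# A "lexical" index matches exact keywords. A "semantic" layer maps
# words into shared concepts so related terms can match too.

DOCS = {
    "report.xlsx": "quarterly budget spreadsheet",
    "trip.jpg": "photo of a beach vacation",
    "notes.txt": "meeting notes about hiring",
}

# Hypothetical concept map standing in for learned vector embeddings.
CONCEPTS = {
    "budget": "finance", "finances": "finance", "quarterly": "finance",
    "beach": "travel", "vacation": "travel", "holiday": "travel",
    "hiring": "people", "recruiting": "people",
}

def lexical_search(query):
    # Match only on exact word overlap, like a traditional indexer.
    terms = set(query.lower().split())
    return [name for name, text in DOCS.items()
            if terms & set(text.split())]

def semantic_search(query):
    # Map both query and documents into concept space, then overlap.
    def concepts(text):
        return {CONCEPTS.get(w, w) for w in text.lower().split()}
    q = concepts(query)
    return [name for name, text in DOCS.items() if q & concepts(text)]

print(lexical_search("holiday photos"))   # no exact keyword hit
print(semantic_search("holiday photos"))  # finds the vacation photo
```

Here “holiday” and “vacation” share no letters, so the lexical pass returns nothing, while the concept mapping connects them, which is the gap Davuluri describes the semantic indexer closing.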
Davuluri then touches on the Settings app, which is now getting an agent, based on local AI models. (This Copilot+ PC exclusive feature just shipped this past Tuesday in the latest monthly cumulative update for Windows 11.)
“Settings is a representation of over 25 years of Windows engineering work, and now we have an agent in settings powered by Mu [a local AI model] that allows people to use natural language interaction to complete a task, and allow the operating system to go find what those steps are and to execute on the user’s behalf, versus having to be able to go do that by somebody being proficient with it,” he says. “So you can just interact with Settings and then ask, hey, fix this mouse cursor size for me. And it just happens. It’s really cool.”
Davuluri also notes in the video that “taking advantage of Windows in the cloud,” meaning Windows 365 and Azure Virtual Desktop, plus Windows 365 Link devices, is “a big area of evolution,” though I don’t see these as a viable replacement for a real Windows PC for most users, regardless of the use case or customer. But there is a distributed element to these products that combines local PC power with cloud-based functionality that will impact the Windows experience for everyone in the coming years.
“With the new Windows app and all these experiences to make Windows adaptive, or hybrid, local AI and cloud are coming together,” he says. “It’s really changing the way Windows has been working on a physical PC [going back] 40, 50 years … Customers just want a simple, secure device. They want simplicity in the device management. They don’t need local data on the system.”
But the big vision statement here, so to speak, comes towards the end of the video when Davuluri is asked about how or whether the way we interact with PCs is going to change because of AI.
“We will see computing become more ambient and more pervasive,” he says. “[It will] continue to span form factors and certainly become more multimodal in the arc of time. We started with the notion of a desktop and a keyboard and a mouse and a monitor. We’ve gone through several revolutions, several technology paradigm shifts. And today one of the things that we celebrate in Windows is the diversity of form factors in which computing is available.”
“Experience diversity is the next space where we will continue to see voice becoming more important,” he continues. “More fundamentally, the concept that your computer can actually look at your screen and is context aware is going to become an important modality for us going forward. The other thing [that’s] going to get more intuitive is multimodal interactions. So you’ll be able to speak to your computer while writing, inking, and interacting with another person … [Finally,] the compute will become pervasive. Increasingly, the Windows experiences are going to use a combination of capabilities that are local and that are in the cloud. It’s our responsibility to make sure they’re seamless to our customers.”
As for the AI doubters or those worried about this new technology, Davuluri’s advice is straightforward.
“My suggestion is to try it,” he says. “When we think about Windows, we have a broad range of customers. We have a set of customers who deeply understand AI, helping us build products, giving us feedback. And then we have a cohort of customers and users who are early to the AI experience or have little understanding of AI. One of the things that I find delightful today is the way we are now starting to integrate Copilot as a companion for you in Windows itself. It’s available in the Taskbar, distributed across Windows 11. It allows you to get to the kinds of tasks that make sense for you in your flow of work. And so I think having Copilot being available to you is a great starting point, and just experimenting and playing with it is one good option.”