In all the coverage of the Surface Go this week one thing has really stood out to me: no one believes the battery life claims. In large part this seems to be because Microsoft insists on using a looped video test, which, quite rightly, no one believes even roughly approximates ‘real world’ use, and which historically has led both Microsoft and their OEM partners to consistently over-promise in this area. So in reading site after site saying the same thing, it struck me that surely we are past the time when Microsoft should have developed an in-house, official Windows battery life test that more accurately approximates the results you would get from the system under a ‘light productivity’ workload.
My vision is something along the lines of a script and a test MS account that run the system through a range of basic tasks. Open the browser and run through a varied series of media-rich webpages. Open Skype and have it start a group chat with a bunch of their latest chat bots that just constantly ping messages to each other. Open the Store and download all of Microsoft's apps, Office etc., then have it download some giant, complicated spreadsheets from OneDrive, generate some pivot tables and graphs or something, then save it all back to the cloud. Do the same sort of thing with some image files: download them, run through some basic edits, change the file format and save them back to the cloud. Etc., etc., insert your choice of basic tasks here. Get a whole thing that takes an hour or two on an ‘average’ system (say an i5 Surface Pro), then have it delete everything it's done and start over.
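The loop described above could be sketched as something like the following. Everything here is illustrative: the task functions are empty stand-ins for the real browser, chat, spreadsheet and image work, and the battery reading is simulated rather than taken from actual hardware.

```python
# Hypothetical skeleton of a scripted 'light productivity' battery benchmark.
# The task names and the simulated battery are illustrative, not a real API.
import time

def browse_media_pages():   # stand-in for loading media-rich webpages
    pass

def run_group_chat():       # stand-in for a scripted Skype chat-bot session
    pass

def edit_spreadsheet():     # stand-in for pivot tables and graphs in Excel
    pass

def edit_images():          # stand-in for basic image edits + format change
    pass

WORKLOAD = [browse_media_pages, run_group_chat, edit_spreadsheet, edit_images]

def run_benchmark(read_battery_percent, stop_at=5):
    """Loop the workload until the battery falls to stop_at percent.
    Returns (cycles completed, elapsed seconds)."""
    start = time.monotonic()
    cycles = 0
    while read_battery_percent() > stop_at:
        for task in WORKLOAD:
            task()
        cycles += 1
    return cycles, time.monotonic() - start

# Simulated battery that drains 10 points per reading, for illustration:
level = [100]
def fake_battery():
    level[0] -= 10
    return level[0]

cycles, elapsed = run_benchmark(fake_battery)
print(cycles)  # 9 full workload cycles before hitting the 5% floor
```

A real implementation would of course read the actual battery level from the OS and drive real applications, but the reported number falls out the same way: elapsed wall-clock time from full charge to the cut-off.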
I'm not imagining that it would be something available to the general public, as it's not really something they'd need to run, just to OEMs and approved media outlets and reviewers. Though maybe the code could be made public so that what it actually does can be verified.
Obviously, it couldn't exactly replicate everyone's usage, as every choice about which software people use would affect the results they saw. So the reviewer who uses Chrome rather than Edge and Slack rather than Teams etc. would probably still complain that their numbers differed from the quoted figure, but at least the difference would have to be less than the 25-30% variance that everyone seems to be expecting from the 9 hours quoted for the Surface Go. What's more, if all the OEMs could be convinced to use it as the standard test they quote for any new device, then consumers would be better informed too, as the numbers would at least be consistent from device to device.
Now I will happily admit that I have no idea how you would go about building this thing, or what limitations you might encounter along the way, but it seems like something a company with the resources of Microsoft should be able to pull off pretty easily, and which would have significant benefits for themselves, their hardware partners and their customers.
Just a thought.