But it’s a Mac


Last week I heard on Windows Weekly that Paul had been given a Mac by Intel. Not just any old Mac but an M1 MacBook Pro.

This week a preliminary review turned up on the site: a comparison between an Intel-based PC and the new M1 MacBook Pro. I am certainly not a MacBook expert but, in terms of pricing and features, they seemed similarly equipped. I come at this as a PC user on Windows 10. Windows has been my OS of choice for many years, primarily because I work in enterprise IT, and you really need to use what your customers use and what you support.

Of course, the M1 Macs have come with considerable hype. There are many YouTube videos promoting a revolution of ARM processors. So I am interested in whether the hype is real.

I start from the position that, in business, the PC isn’t disappearing. Many home users have a PC budget that is nowhere near the price of any M1 Mac. On the other hand, we also see that Chromebooks are carving out a niche by appealing to a cloud-based future and easy manageability. I also think that today’s Ultrabook designs are a direct response to machines like the MacBook Air.

I think the failure of Windows Phone, and Windows still being entrenched in just one form factor, also says to me that people don’t really care about operating systems. People run applications. In the past Windows has benefitted from having a vast applications library. In business it benefits from running line-of-business applications. On macOS you can get Microsoft Office and a number of capable web browsers. If this is all someone needs, then a new M1 MacBook Air is very favourably priced at the premium end of a home user budget: the same place as a Surface Laptop.

It all seems to boil down to whether you can use macOS. That seems about it.

51 responses to “But it’s a Mac”

  1. 2ilent8cho

    We have started rolling out the M1 MacBook Pros to our staff. They are very responsive, the battery actually lasts close to what Apple says (we are getting 17 to 20 hours consistently), and we have not heard the fans once, so they are nice and silent, even with Teams loaded! I played some Steam games on one, like Cities: Skylines, and it runs silky smooth and quiet; that game makes my 16" Intel i9 MacBook Pro with dedicated graphics get hot and its fans go mad.

    Yeah, it's just a Mac, but the fact that Apple's low-power, entry-level chip, which is effectively competing with Intel's Y-series i3/i5, can keep up with and in some cases beat Intel's high-end H-series laptop chips that run at 45W, while being only a 10W entry-level part, is just impressive. The exciting stuff will be when Apple actually releases its fast chips...

  2. hrlngrv

    The really bad news for MSFT is that Windows still has a vast applications library, and that's a problem because it means MSFT really can't afford to reboot Windows in some other direction. From my perspective, there's no way MSFT's main customers, enterprises, want fundamental change because training costs would be too high. Likely LOTS of home users don't want it if it'd mean they could no longer use certain applications they've been using for years, maybe decades.

    It's the unprecedented success of 3rd party Win32 software that serves as a straitjacket for MSFT. These days it forecloses far more future revenue paths for MSFT than it preserves, all in exchange for maintaining current revenue streams.

    MSFT is a victim of its own success.

    There are several obvious paths forward, but they ALL require something new NOT being called Windows. Which means MSFT would have to do the hard work of establishing a new brand, as well as convincing 3rd party developers to embrace MSFT's new platform in the face of an obvious chicken-and-egg situation. Where are the users before there's lots of software? Where's the incentive for developers before there's lots of users?

    Anyway, in my experience, decently written in-house software runs reasonably well under Wine on Linux. Odds are the same would hold for emulation on Macs. Which leaves workplace software with only Windows variants, and these days that's almost exclusively MS Office (the Mac version is a joke compared to the Windows version). If one needs Office at work, one still needs Windows. If one doesn't need Office, one doesn't, in fact, need Windows.

    • fpalmieri

      In reply to hrlngrv: That's where the container work in Win10X comes in: everything will be running in a container in the future. Azure, WSL and even UWP all have some variant of that now for various pieces and parts. That's how legacy stuff is supported while providing a path for moving stuff forward, if it is still being actively developed, with Project Reunion, etc. But MS has to pull it off, which is why it is good that they pulled Win10X if it was half-baked. Half-baked efforts like WinRT don't work anymore for MS now that they don't hold all the cards!

      • hrlngrv

        In reply to fpalmieri:

        Picky to start: WSL 2 already runs via virtualization, so changing to a container would be a step backwards. Then again, WSL with full host filesystem access would hardly be consistent with the claimed security advantages of Win 10X, would it?

        Given that Chrome OS with the Citrix Receiver app has been around since at least 2014, allowing me to do most of my particular job from a Chromebook, Azure under Windows 10X is NBD; indeed, truly underwhelming.

        Re containerization: if I use 2 different Win32 programs which use, say, Python as their internal scripting engine, would each of those Win32 programs need its own full Python subsystem? If so, MSFT needs to take a look at Flatpaks under Linux and get some pointers.
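        For illustration, here is a rough sketch (in Python, with hypothetical directory names) of the saving a Flatpak-style shared runtime gives over each app bundling its own copy of a scripting engine: files that are byte-identical across apps get stored once instead of once per app.

```python
import hashlib
import os

def dedup_savings(app_dirs):
    """Bytes saved if identical files across apps were stored once
    (a Flatpak-style shared runtime) instead of bundled per app."""
    unique = {}   # content hash -> size of that unique blob
    total = 0     # total bytes if every app bundles its own copies
    for d in app_dirs:
        for root, _, files in os.walk(d):
            for name in files:
                data = open(os.path.join(root, name), "rb").read()
                total += len(data)
                unique[hashlib.sha256(data).hexdigest()] = len(data)
    shared = sum(unique.values())   # size with each unique blob stored once
    return total - shared           # duplicated bytes a shared runtime avoids
```

        Point two hypothetical app directories (each bundling the same runtime files) at `dedup_savings` and the difference is exactly the duplicated runtime bytes.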

        Finally, moving forward, which developers would want to develop anything for Windows alone? Consider MSFT's R Open, always at least one, and often two versions behind FOSS GNU R, a major stats package for which many use add-on GUI front-ends (FWIW, I use R Studio). Did MSFT ever make any attempt at a UWP GUI for it? Nope. Why not? I'll let you ponder that.

    • dftf

      In reply to hrlngrv:

      It is mad when you consider that iOS and the latest versions of macOS have gone 64-bit only (kernel and apps), as have Ubuntu and derivatives like Mint (kernel only, not apps); and I have read that by the end of 2022, Google will only provide 64-bit images of Android that will only run 64-bit apps.

      Yet Microsoft still offers 32-bit kernel versions of Windows 10 that can run 16-bit apps.

      At least any device you buy online or in a shop can no longer come with a 32-bit kernel version of Windows pre-installed, but progress sure is glacially slow.
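      As a quick illustration of where that bitness lives: every Windows executable records its target architecture in a two-byte Machine field of the PE/COFF header. A minimal sketch in Python (the constants are the standard IMAGE_FILE_MACHINE_* values from the PE format specification):

```python
import struct

# Machine values from the PE COFF header (IMAGE_FILE_MACHINE_*)
MACHINES = {0x014C: "x86 (32-bit)", 0x8664: "x64",
            0xAA64: "ARM64", 0x01C4: "ARM (32-bit)"}

def pe_machine(data: bytes) -> str:
    """Return the target architecture of a Windows PE image, given its raw bytes."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file (missing MZ header)")
    # e_lfanew at offset 0x3C points to the "PE\0\0" signature
    (pe_off,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_off:pe_off + 4] != b"PE\0\0":
        raise ValueError("missing PE signature")
    # Machine is the first COFF header field, right after the signature
    (machine,) = struct.unpack_from("<H", data, pe_off + 4)
    return MACHINES.get(machine, hex(machine))
```

      Feed it the bytes of any `.exe` or `.dll` (e.g. `pe_machine(open("app.exe", "rb").read())`) to see whether the binary is 32- or 64-bit.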

      • james.h.robinson

        In reply to dftf:

        Enterprises need the backward compatibility.

        • dftf

          In reply to james.h.robinson:

          You know, I hear this all the time whenever I mention retiring the 32-bit kernel versions of Windows, so I really am curious: how many large enterprises still have 16-bit apps, or use devices that only have a 32-bit driver available?

          And for those that do: (1) why not just run an older version of Windows on those PCs, inside a segmented network which doesn't connect to the Internet; or (2) why not install the LTSC versions of Windows 10 on them, as you clearly won't be using modern features on them, like Microsoft Store apps, the Xbox Game Bar or "Your Phone"? Surely old apps and devices would run better on an older version of Windows?

          At the very least, I think Microsoft should offer versions of Windows 10 that are 64-bit only... you're never going to get Apple M1 levels of CPU design on the PC side while all CPUs have to keep supporting the 32-bit instruction set...

          • hrlngrv

            In reply to dftf:

            At the very least, I think Microsoft should offer versions of Windows 10 that are 64-bit only.

            You obviously don't use Excel much. In its infinite wisdom, MSFT's Excel or Office development team(s) decided that 32-bit compiled .XLL or DCOM add-ins aren't supported under 64-bit Excel. There aren't many such add-ins in widespread use, but there are some in some industries (certainly in financial services).

            My point: you may be surprised by how FEW enterprise customers would buy such a version of Windows. OTOH, MSFT is almost certainly aware of how few would, along with damned little home user interest (or potential for disappointment a la Vista Ready vs Vista Capable), which may be the reason MSFT doesn't offer such a version of Windows.

      • Paul Thurrott

        Businesses aren't looking for progress, they're looking for compatibility. And Microsoft delivers it.

        • dftf

          In reply to paul-thurrott:

          As above: why don't they use the LTSC on such machines (given they only use them to run some old apps or devices, and won't be making use of new features), or just run older versions of Windows on them, which would be better suited to running old apps and drivers anyway?

          There's a reason retro-gamers on Windows often get hold of old PCs and run Windows 98 SE or Windows 2000, instead of running the 32-bit versions of Windows 10 and endlessly hunting around forums for fan-made launchers, patches and hacks...

          • waethorn

            In reply to dftf:

            LTSC is for appliance machines for single-purpose/app use, not general-purpose computers (Microsoft even states that Office isn't licensed for use on it). It's also difficult to acquire unless it's already part of your company volume license agreement.

            You can build a special-purpose machine and install that version on it, but you have to sign up with an NDA to Microsoft's embedded channel to be able to order it, and it's not available from regular Microsoft Authorized Distributors where you normally get System Builder/Commercial-OEM software from - you have to get it from specialty companies that are Microsoft Authorized Embedded Distributors, like Advantech.

      • hrlngrv

        In reply to dftf:

        Yet Microsoft still offers 32-bit kernel versions of Windows 10 that can run 16-bit apps.

        The world domination of the late 1990s and early 2000s was founded on backwards compatibility. Now that's become MSFT's curse. Maintaining over 85% usage share for Windows requires that Windows be able to run older people's older software. FWIW, my wife still uses an abandonware knitting pattern editor last updated in 2003. For her, there is no newer acceptable alternative, and she and I have looked at damn near all the alternatives.

        It runs, maybe not as fast, but without crashing under Wine on Linux. Do you believe MSFT would risk losing Windows usage to Linux from those who still want to run old Win32 or older software? Sure, there's a version of DOSBox for 64-bit Windows, but I've tried it, and I've tried dosemu under Linux. Not much difference.

        My point: if one wants to run a current browser and ancient application software, there are already cheaper AND just as reliable alternatives to Windows. If MSFT were to put any hurdles in those users' paths, MSFT would likely only succeed in demonstrating that Linux was indeed good enough for at least 30% of all home PC users, and that could drop Windows below 80% worldwide usage share.

      • wright_is

        In reply to dftf:

        That is down to corporate customers, mainly, and legacy software.

        We have lab equipment and production controllers that are controlled by Windows software. The problem is that the hardware is designed to last 20 years or more, but the control software doesn't get upgraded.

        Buy a new PC, with a 64-bit processor and OS? That will be €1,000,000 for a new production line, thank you very much. Or €100,000 for new lab equipment.

        We even have an old serial metal printer that only has MS-DOS software and can't run in a VM. We bought a 20 year old version of the printer as a backup last year and we still keep old PCs around in case the current one dies. (Note: that 20 year old printer still cost several thousand Euros, a new one would cost between 50K and 100K, so a couple of thousand for a second hand one and being restricted to MS-DOS is a small price to pay.)

        That is the problem that Microsoft faces. They can't be like Apple and just leave their customers in the lurch and move to a new platform, forcing the customers to upgrade. Making the cheapest part of the equation incompatible with the expensive parts isn't an option for Microsoft, and the costs of a PC and Windows are negligible compared to the PLC, bespoke software, hardware etc. Microsoft would be laughed off the premises if they tried to tell their customers that they need to invest millions to replace perfectly serviceable kit, just because the new OS isn't compatible with the old one.

  3. joeaxberg

    I think we should all just be thankful that we live in such a gilded age of computing. There are lots of different computers/devices, at lots of different prices, with lots of different features, designed in lots of different ways.

    That's a good thing!

    Whether or not Apple's ARM CPUs really are the shiznit, the fact that Intel felt compelled to run ads against the Mac is also a good thing. I hope Intel is feeling pressure. They should be feeling pressure.

    When Apple switched to Intel, Intel was having its struggles then as well: big clock speed increases were no longer attainable, and the switch to multi-core wasn't exactly trouble-free. The original MacBook Air also came with lots of limitations.

    But they helped polish each other's image.

    Apple represents something like $1 billion of Intel's revenue (I think?), which they can certainly absorb. Still had to sting, though, I bet.

  4. waethorn

    In reply to lvthunder:

    Your Surface Book 3 isn't a desktop computer, nor does it have a desktop processor in it, so your argument is moot.

  5. wright_is

    In reply to lvthunder:

    The design for the M1 has everything on the same piece of silicon.

    Cores, GPU, ML, RAM, IO and firmware.

    The dies aren't big enough to do that on desktop or server chips. 32 cores, plus 64GB or 128GB RAM and a high-end workstation graphics card, RAID controllers etc. just won't fit onto one piece of silicon. Their design won't scale to desktop and server levels.

    Our current servers have 2 x 16-core Xeon Gold processors and 512GB RAM; you can't squeeze that onto a single die, and that is small fry compared to high-end servers.

    The same goes for the desktop: it just isn't economical to put everything onto one piece of silicon when you get to desktop scales. For a low-end desktop, maybe, but real, high-end desktops and workstations? Not with the current state of technology.

  6. waethorn

    Nobody has ever tried to scale ARM to desktop-level performance. Apple is doing it in their laptops and SFF designs, but they haven't scaled up to a larger desktop chip yet. We'll see how well performance keeps up when they start competing against the current multi-core Ryzen Threadripper CPUs. Apple will come up with a way to reinvent the wheel, but you can't make a square wheel rotate like a round one; i.e. an ARM chip isn't designed to be put into an HEDT or workstation-class machine. Apple will figure out some way to make it work, but I don't get the feeling that Apple will build something similar to current desktop chips, with a large heatsink and an NVIDIA RTX-like GPU with lots of blower fans. Mark my words: whatever Apple comes up with, they'll have to convince customers that the way they build computers is the "right way" and everything in current Windows PC desktop system design is "the old way of thinking" and somehow wrong, so as to distract from the competition.

    • bkkcanuck

      In reply to Waethorn:

      No, they did not just scale up to 'desktop-level' performance, so to speak: they scaled up to supercomputer-level performance. Apple has an ARM architectural license and develops its own chips. Fujitsu holds the same kind of license and built the A64FX, used in the largest supercomputer in the world (Fujitsu used to use the SPARC license before). The team that Apple put in place to develop its chips came from PA Semi, which was created by chip designers who had previously worked on high-performance chip design in years long gone (i.e. the DEC Alpha, before Compaq bought them; they left to create a new company, NetScaler I believe, which in turn was taken over by Citrix, which in turn led to many going off and creating PA Semi).

      • waethorn

        In reply to bkkcanuck:

        What are you even talking about? They don't have an M-series chip in a desktop, nor does it compare to Ryzen processors, nor do they have anything in a Mac server computer (do they even make those anymore?).

        • bkkcanuck

          In reply to Waethorn:

          Why are you changing what you said (from "Nobody has ever tried to scale ARM to desktop-level performance" to, basically, "Nobody has ever tried to scale the M1 to desktop-level performance")? I suspect it is because your argument fell apart from that very first line. My response was to your boldly stating, "Nobody has ever tried to scale ARM to desktop-level performance". I gave you the example of the largest supercomputer in the world running on the ARM instruction set, and that rather obliterates your bold statement.

          One of the reasons why ARM core designs have improved recently is the input Apple has had into the base architecture, which was fundamental to improvements in the ARM instruction set. ARM as an architecture and as an instruction set is now mature enough to use in servers and supercomputers. Look at Amazon's AWS Graviton2 instances: if I remember right, they provide the same performance as the Xeon-based instances at half the cost.

          One of the major limitations on the ability to scale up has always been the efficiency of the processor design itself. The more power you add, the more heat it generates, and it becomes a fight to fit within the thermal envelope that is allowed or that can be dissipated. The M1 is very efficient and thus powerful for the thermal envelope it is restricted to. Each node-size decrease has always brought more efficiency, which can be spent either on more power within the existing thermal/electrical envelope, or on fitting the same power within a smaller thermal envelope and battery size.

          In the end it is a balance of what you try to fit on a given piece of silicon and what you leave out. There are several inefficiencies in x86 designs that are there because of legacy (x86 translation to microcode), and others that are there because of requirements from the marketing department which were beneficial for benchmarking and product differentiation. On balance, though, these tend to be wasted silicon that could be used for other purposes, especially when you control both the platform and the processors.

          It comes down to how well Apple's chip engineers balance these trade-offs. The fact that Apple builds these processors only for its own specific needs means it has more room to manoeuvre and build better chips for its desktop or workstation computers.

          The x86 instruction set does not make for the best designs. In fact, the x86 chip is now basically a completely redesigned chip (since around the P6, I think), where they basically chucked their failing underlying CISC architectural design and replaced it with a more RISC-ish design plus a hardware/microcode translation layer between the x86 instruction set and the underlying micro-ops (basically a RISC-ish instruction set). The reasons, if I remember right, had to do with parallelizing the pipelines: having one x86 instruction take 56 clock cycles (or more) and another 1 clock cycle made it difficult to parallelize efficiently (along with the branch prediction).

          The advantage Intel had was often that its internal fabs were better than the competition's; that is no longer the case.

    • wright_is

      In reply to Waethorn:

      There are already several chip manufacturers with ARM chips that compete with Xeon and Epyc. The current fastest supercomputer uses Fujitsu ARM server processors, for example.

      The problem is, ARM on the desktop is still very niche and nobody has been willing to invest in it, up until now.

    • hrlngrv

      In reply to Waethorn:

      Perhaps a tangent: I have a daughter who works in commercial space floor plans. Not really architecture, not really interior design. Anyway, she uses a lot of rendering software to provide clients with virtual walk-throughs. Most of the processing for that takes place on graphics cards rather than in the CPU. Couldn't Macs get a lot of work out of graphics cards? That is, if there are still any Macs in which graphics cards could be installed and function. I don't know because I don't use 'em.

      OTOH, it does seem if one's usage involves a lot more floating point numerical processing than average, RISC still isn't a competitor for Intel/AMD.

      • waethorn

        In reply to hrlngrv:

        Depends on the rendering software. If it's ray-tracing software, a lot of that is still software-based because precision is important. GPUs are built for realtime rendering. They can cut down on calculations used for non-realtime rendering too, but of the architecture design software with rendering and/or ray-tracing that I've seen, almost all of it is CUDA-based, meaning NVIDIA graphics cards only.

        Now, MAYBE IF (big if) NVIDIA manages to buy ARM, they could license some of their graphics IP to Apple for discrete GPUs, if Apple is even interested (evidence suggests they're not).

  7. illuminated

    The best part about Macs was (and maybe still is) the lack of mandatory enterprise garbage that companies install on Windows computers.

    • dftf

      In reply to illuminated:

      That's not necessarily a fault of Windows, though: the same thing is true of preinstalled apps on Android devices. Both offer manufacturers the flexibility to add things, which they get paid to do by the companies whose software they add, and thus the device can be sold at a cheaper price to the consumer.

      And Microsoft did have something a while back called "Signature PC", where you'd get a clean image, but I'm not sure whatever happened to that.

      And sure, Windows 10 does come with a lot of "fluff" apps in the Start Menu by default, but after doing a right-click > Uninstall on them, none of them have ever re-appeared for me, just like when you "Uninstall" or "Disable" stock apps on Android...

      • illuminated

        In reply to dftf:

        It is not Windows' fault, but it affects the perception. My own Windows laptop is awesome. The company-issued one, even with better specs, is a dog because of all the IT garbage that keeps the fans spinning and CPU usage above 20% while doing nothing.

        • dftf

          In reply to illuminated:

          Sadly, for legal compliance in many areas of the world, IT has no choice but to put various auditing, scanning and reporting software on there.

          A home machine is likely to only be running an AV and possibly full-disk encryption, by comparison.

          And it's not just Windows: corporate iOS and Android devices often have to have apps installed on them via managed solutions like VMware AirWatch or Microsoft Intune.

        • wright_is

          In reply to illuminated:

          You have the wrong IT department. We strip out all the gunk and only hand out machines with the bare minimum of software needed by the user.

          We start with a clean image and that gets AV, Office 365, and TeamViewer. That's it.

  8. curtisspendlove

    :: shrug :: It really always has been about whether you prefer macOS or Windows. This just gives a hardware boost.

    People who like macOS will buy it. People who don’t, won’t.

    A small number of people might be driven to try based on what people say about the hardware. Some of those people will like a Mac, some won’t.

    • dftf

      In reply to curtisspendlove:

      Yes: clearly the M1 is a good thing, as it will give the industry a swift kick up the backside, and hopefully create better CPUs all 'round. But I'd rather have a better CPU in a Windows device, personally, than switch to macOS.

  9. jdjan

    Great post.

    The OS isn't as important as it once was. Gaming is really the only 'killer app' for Windows, but who the hell can afford to keep up with the hardware required? Besides, just get an Xbox. For most other mainstream computing needs you can use a Mac or a PC interchangeably. It used to be that MS Office on the Mac sucked and had incompatibilities, but those days are gone and the Mac versions are almost (but not quite) on par with their Windows equivalents.

    Where the M1 and its successors matter is likely to be, as you state, better hardware at a comparable price. Apple doesn't play in the value end of the spectrum, so there will always be a cheaper Wintel to be had, but dollar for dollar at mid-level and high-end they are going to start eating PC makers' lunch. And most users won't care that it's macOS.

    • james.h.robinson

      In reply to jdjan:

      Gaming might be the Windows "killer app" for CONSUMERS, but there are many apps in many industries that are both crucial and Windows-only.

    • dftf

      In reply to jdjan:

      As time goes on, I think AAA PC gaming will become super-elite: spec-wise, by the end of this decade you're going to have games asking for things like "16GB RAM minimum; 24GB preferred; 32GB recommended", and 200-300GB of hard-disk install space. Especially once they start offering 8K, 16K and 32K resolution support, and all the models, meshes and textures to go with them...

      Eventually, streaming services will take over for AAA, as it'll just make more sense to subscribe to them than to keep paying to upgrade the RAM, GPU, CPU, storage and motherboard locally...

    • hastin

      In reply to jdjan:
      "...but who the hell can afford to keep up with the hardware required?"

      And yet, demand seriously outstrips supply right now, across the whole spectrum of the market. My local Micro Center is always at capacity. Lots of bored people at home looking for a new hobby, that's for sure.

  10. jimchamplin

    I just buy the computer that works the best. Linux runs on all of them :)

    • dftf

      In reply to jimchamplin:

      I believe there is even a distro currently being worked on for the new M1 Macs... (though personally I'd be more interested to see Windows 10 on ARM running on one)

      • prebengh

        In reply to dftf:

        You can actually run Windows on ARM on an M1 Mac using Parallels (the Technical Preview can be downloaded for free). Windows on ARM is a beta version that can also be downloaded for free. However, only 64-bit applications can be run, but they do run fine.

        • dftf

          In reply to Prebengh:

          Okay... I meant that no native boot is possible yet.

          Given Windows 10 on ARM supports AMD64, Intel32, ARM64 and ARM32 apps, what happens if you try to run an Intel32 (x86) or ARM32 app?

          • wright_is

            In reply to dftf:

            They don't work on the M1; it is purely 64-bit. Only x64 (via translation) and ARM64 run on the M1.

            To get x86 and ARM32 apps running on the M1, Microsoft would need to build 32-bit emulation software (or Parallels would need to write it into their VM code).
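            On the macOS side, the same "purely 64-bit" point is visible in a binary's Mach-O header: a 32-bit magic number won't load at all on Apple silicon, while a 64-bit image is either arm64 (native) or x86_64 (translated by Rosetta 2). A rough sketch in Python, using the standard magic and CPU-type constants from Apple's headers (thin images only; fat/universal binaries are not handled here):

```python
import struct

# Magics and CPU types from <mach-o/loader.h> / <mach/machine.h>
MH_MAGIC_64 = 0xFEEDFACF   # 64-bit Mach-O image
MH_MAGIC_32 = 0xFEEDFACE   # 32-bit Mach-O image: won't load on Apple silicon
CPU_TYPES = {0x01000007: "x86_64 (runs via Rosetta 2)",
             0x0100000C: "arm64 (native)"}

def m1_compat(data: bytes) -> str:
    """Classify a thin Mach-O binary by whether an M1 Mac can run it."""
    # Header fields are little-endian on x86_64/arm64 targets
    (magic, cputype) = struct.unpack_from("<II", data, 0)
    if magic == MH_MAGIC_32:
        return "32-bit: not supported on Apple silicon"
    if magic != MH_MAGIC_64:
        return "not a thin little-endian Mach-O image"
    return CPU_TYPES.get(cputype, "64-bit, unknown CPU type")
```

            On an actual Mac, `m1_compat(open("/bin/ls", "rb").read())` (or the `file` command) tells you which camp a given binary falls into.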

  11. dftf

    I don't think anyone would deny Apple have done a great thing with their M1 processor; but it's also not something that cannot be copied. I'm sure we'll soon see an AMD CPU with the RAM integrated, and this would make sense for lower-end devices, as I can almost guarantee the vast majority of home users and small-business users never do any upgrades on their machines. (Using an external USB HDD would be about it.) So why keep the RAM separate when most users will never upgrade it, and you can remove a bottleneck?

    And such processors would likely run faster and, with AMD's ever-decreasing process size, use less power. So before long, you'll see Windows devices attaining similar battery life to M1 Macs. (Though Windows continuing to offer 32-bit support, unlike current versions of macOS, may hinder things there: CPUs having to support both the AMD64 and Intel32 instruction sets will, I'd imagine, somewhat limit how far the CPU designs can be shrunk? Not a chip expert, so don't quote me on that, but I imagine 64-bit-only CPUs will be more efficient...)

    But as to what I would like: a better CPU powering a Windows device. Sorry, but I just don't personally see the need to use macOS. I can do what I need to on Windows 10, and it largely works fine for me. Sure, there are new features I'd like to see, and improvements to existing ones, but I don't get any blue screens or issues following Windows Updates or any of that stuff other people seem to get monthly. Sorry your device is unlucky, but mine isn't, so I see no need to switch...

    • shark47

      In reply to dftf:

      I’ve been lucky with Windows PCs; they’ve generally worked. I still got myself an M1 Air to see what the fuss was about, and I think this is one instance where Apple deserves all the praise it is getting. I still prefer Windows as an OS (I think we might be a dying breed), so I keep going back and forth, which is not ideal.

  12. F4IL

    I think MSFT have acknowledged the squeeze that alternative platforms such as Chrome OS and macOS will put on Windows, and are belatedly trying to respond as best they can. Lower price segments have witnessed a slow and steady increase in Chromebooks, whereas the introduction of ARM Macs has inevitably renewed people's interest in the more lucrative mid-to-high segments of the market.

    On the topic of "but it's a Mac", I think it cuts both ways. Even after an arguably big hardware transition, the OS remains close to the principle of least astonishment which is precisely what users expect; things are the same as before but better and they "just work".

    • fpalmieri

      In reply to F4IL: Microsoft made a bunch of half-baked efforts to compete with various other platforms (Windows 8, Windows 8.1 with Bing, Windows RT, Windows 10 S, Windows Mobile, Windows Phone 7, 8.x and 10) that, from a software developer's perspective, were different enough, or required too much extra work, to extend your existing applications to, given the various limitations (different XAML flavours, ARM, UWP, etc.). They really screwed up with all these "alternatives" as far as developers are concerned; most Windows applications today are still being developed on tech stacks (MFC, WPF, WinForms, etc.) from years ago. That's actually why the whole WinUI, Project Reunion, Win10X, Blazor, MAUI, .NET 5 and 6, etc. effort makes me slightly "hopeful" as a primarily Windows developer going forward. On the development stack and tools side, Microsoft really seems to have its act together more than in a long time (VS Code has taken over the whole cross-platform editing market, and Windows Subsystem for Linux and Visual Studio have had great improvements for cross-platform work, like the recent ability to use Linux crash dumps and GDB through Visual Studio). Microsoft still has a lot of work to do and has run out of much runway to make a meaningful change on the client OS side. Azure, on the other hand, is easily a big driver of a lot of this too, but the developer story ties into the OS story. Apple has done an amazing job (looking from the outside) of moving both its OSes and its development story along in a very clear manner. I don't expect Macs to rule the world anytime soon, but Microsoft could learn from Apple as they try to kick Chrome OS to the curb in the markets where it is encroaching; a superior performance from the new Intel CEO wouldn't hurt either, nor would a better partner on the ARM side than Qualcomm.

  13. illuminated

    In reply to lvthunder:

    Data caps are mostly a US thing, IMHO. I've never heard of data caps in Europe.

    • matsan

      In reply to illuminated:

      Pretty common on business plans and the lowest-level consumer plans. My business plan has 20GB/month of data (no carry-over) priced at 549 SEK/month, but for our homestead we have an unlimited consumer plan (same carrier) for 588 SEK/month. In a normal month we push well over 300GB over LTE, and with the new 4K TV we will probably up that amount.

      The business plan includes data-roaming in the US and that was a good thing pre-Corona with 4-6 trips/year to the US.

  14. wright_is

    In reply to lvthunder:

    There is a difference between capacity (bits per second) and data caps.

    There is limited capacity, and with everyone at home working remotely and watching streaming video, the networks were suddenly running over capacity. But data caps won't help there: if everybody tries to watch Netflix at the same time, a cap of 100GB a month won't change that.

    We get through around 750GB a month at the moment, no caps, no extra charges, just a flat €39 a month for telephone and Internet.

    At work, we have a symmetric gigabit line with no limits, which is good, because we are the central data processing centre for all sites.
