From the Editor’s Desk: What We’ve Lost

Sometimes I have to remind myself how lucky I am. Keeping this just to personal technology, I am a member of the first generation of humans to grow up with arcade videogames, home videogames, and home computers. Our expectations grew exponentially with these innovations, just as they did for subsequent generations with the Internet, the mobile revolution, and now AI. Things constantly change, but sometimes they change a lot.

As a teenager and young man I became fascinated by how companies like Atari, Commodore, Apple, Microsoft, Digital Research, and so many others changed the world with the products they created and the industry they invented. I wanted to master the hardware and software, to learn how to write code, and to use them all to make things. The possibilities seemed so endless and I was obsessed. Obsessed with the future and how wonderful it all was.

But there was also a sense of loss. By the mid-1980s, books were already being written about the early histories of Apple, Microsoft, and the others. Personal technology was evolving, of course, and getting better, but the origins of this industry were already in the rearview mirror. I could learn and improve along with the technology, but I wasn’t part of it all; I wasn’t there at the beginning. I had missed it.

But in evolving and improving, this industry gave me another opportunity, a way for me to be there in the thick of things when it all first happened. The rise of the Internet, the modern Internet in the form of the web, coincided with me deciding to go back to school to study software development, and choosing to do so in Arizona, where I would meet the person who pushed me into what’s still my career over 30 years later.

The sheer coincidence of it all cannot be overstated. As noted up top, I’m lucky. I stumbled into this career blindly and with no plan, and I’m part—a small part, on the periphery—of the history that unfolded and continues to unfold in this crazy industry. In some ways, I’ve seen it all. And yet I can be surprised and delighted when some truly impressive innovation proves yet again that change can be good, can lift us all up, can move humanity forward. Not always. But sometimes, and less now than ever.

What wasn’t clear over the intervening years was what we’d lost, though I had seen hints of it over time. It was only recently, in researching topics related to the dawn of personal computing that will result in future Tech Nostalgia posts and, perhaps, even a focus month or two, that the problem suddenly presented itself clearly.

This industry, now run by some of the worst human beings on earth, was started by dreamers.

Some were engineers who worked at companies like Intel and Motorola and created the first microprocessors that went on to power the first microcomputers, as they were called. Some were hobbyists who saw these chips and invented the microcomputer boards and then systems that defined that first era.

Some were tinkerers who realized that these microcomputers would be useless without software, and that that software would need programming languages, pillaged when possible from mainframe and minicomputer systems.

And some, perhaps most crucially, were individuals, people who saw the promise in all this even when there wasn’t a lot there, so to speak. People like me and, I bet, you. People who grew up in this world and just rode the wave.

Tied to these dreams was an inherent sense of hope. A big part of this had to do with where most of this innovation and new thinking was occurring: California, a place of eternal sunshine and good weather, of hippy vibes lingering from the 1960s and early 1970s. For a kid from cold, snowy Boston, California was itself a dream, an aspiration. It was a place where anything seemed possible, an easy enough belief when you’re on the other side of the continent.

Possible, yes. But it also required some work. If you bought a Commodore 64 in the early 1980s, as I did, or some other home computer system, you were on your own. You could plug in a cartridge and play a game or run an application. Or you could just turn on the system, stare at a blank screen with its “READY” prompt and blinking cursor, and wonder what to do next. But the Commodore 64 came with a surprisingly good user guide that told you what you could do next. You could write a software program yourself.

This capability was either uninteresting to you or the most interesting thing in the world. I was in the latter camp, obviously, a dreamer who read and re-read the Commodore 64 User’s Guide just as I would read and re-read Charles Petzold’s Programming Windows a decade later. The Sprite Graphics chapter in that User’s Guide explained how one might create small, animated objects that existed on top of and outside the rest of the video display. It provided as an example a cute little Commodore balloon sprite that anyone who owned this computer will remember fondly. I created it and animated it around the screen again and again and again.

The key to all this, of course, was that the Commodore 64, like most other 8-bit microcomputers of the day, came with a version of BASIC, usually a Microsoft BASIC, built into its ROM. It was this BASIC that you would use to write your own programs, or maybe even your own games. Microsoft BASIC has been heavily criticized over the years, but the 6502 BASIC it created for the Commodore, the Apple II, and other 8-bit computers of that era was quite good.
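
To give a sense of how approachable that was, here is a minimal sketch of the idea, reconstructed from memory rather than copied from the User’s Guide’s balloon listing; it uses the standard VIC-II registers, with a solid block standing in for the balloon shape:

10 REM MINIMAL SPRITE SKETCH, NOT THE GUIDE'S BALLOON LISTING
20 V=53248: REM BASE ADDRESS OF THE VIC-II VIDEO CHIP
30 POKE V+21,1: REM ENABLE SPRITE 0
40 POKE 2040,13: REM POINT SPRITE 0 AT MEMORY BLOCK 13 (ADDRESS 832)
50 FOR N=0 TO 62: POKE 832+N,255: NEXT: REM A SOLID 24X21 SHAPE
60 POKE V+39,7: REM COLOR THE SPRITE YELLOW
70 FOR X=24 TO 255: POKE V,X: POKE V+1,100: NEXT: REM SWEEP ACROSS
80 GOTO 70

Eight numbered lines typed at the READY prompt, and something you made was moving across your television screen.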

As the personal computing industry evolved and inevitably centered on the IBM PC and then the PC-compatible clones, everything changed. I eventually got a Commodore Datasette tape recorder so I could save the programs I wrote, and, later still, a 1541 floppy disk drive. But over the course of that decade, 5.25-inch floppies would give way to smaller, higher-capacity 3.5-inch floppies in hard protective cases, and finally to hard drives.

Microsoft BASIC evolved as well. It was bundled with the first IBM PC, of course, alongside the first PC-DOS, and it evolved into ever more powerful versions like GW-BASIC, QuickBASIC, and QBasic. There were versions for other computers, like the horrible BASIC Microsoft made for the Commodore Amiga, but by that point the world was moving on to other, more sophisticated languages, too, like Pascal, C, and even assembly language, each of which was more complex and powerful.

But then Microsoft did something that resonated with me deeply. Though it had C and assembler products of its own, and though Borland was showing everyone up with an innovation of its own called Turbo Pascal, Microsoft released Visual Basic, first for Windows and then for DOS. Per its name, Visual Basic was visual and easy to use. But it was also crazy powerful, a new way to create software programs that was accessible to just about anyone. It was, in its own way, a source of hope for the dreamers, those who wanted to create.
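
What did accessible look like? Something like this from-memory sketch of classic Visual Basic, where Command1 is simply the designer’s default name for the first button you draw: drop a button on a form, double-click it, and fill in the event handler.

Sub Command1_Click()
    ' Runs when the user clicks the button drawn in the designer
    MsgBox "Hello, world!"
End Sub

The form, the button, the window management: all of it was drawn, not coded. That was the point.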

More to the point, Visual Basic was to the more sophisticated and expensive programming environments of its day what the Commodore 64 was to the IBM PC a decade earlier: A simpler thing that was also better and less expensive than the alternatives. Yes, professional computer scientists and programmers reacted in a viscerally negative way to Microsoft and its ridiculous “toy” language. But Visual Basic was the perfect embodiment of what Microsoft was in its early days, a scrappy upstart trying to change the world by taking access to technology away from the white lab coat-wearing elite and giving it, Prometheus-like, to the masses.

I recall seeing an IBM PC for the first time in the early 1980s. My dad had bought one for his business, and it looked like everything a computer should be, professional and well-made. But as a C64 user at the time, I found the PC clunky and in some ways less powerful and sophisticated than my Commodore. It had a green, text-only display with no sprites or graphics of any kind, for example, and the only sound it made was a weak beep that compared unfavorably to a transistor radio. I was unimpressed.

At that time, I wasn’t aware of Commodore founder and president Jack Tramiel or his famous motto about making computers “for the masses, not the classes.” But I understood what he and Commodore were all about the moment I figured out the IBM PC’s shortcomings. Here was this little 8-bit breadbox of a computer that by then cost under $200, and I very much preferred it to the incredibly expensive and more professional PC that was poised to take over the industry. I couldn’t have articulated what Tramiel was trying to accomplish with those computers, but I did innately understand it.

Tramiel later quit Commodore after confronting the company’s chairman, Irving Gould, over his illegal use of company resources. He went on to acquire Atari, where a small but talented team of mostly ex-Commodorians created the ST line of 16-bit computers, again serving “the masses, not the classes” with a product that significantly undercut the price of the competition while being superior in meaningful ways. The ST was less successful in its era than the Commodore 64 had been, of course, but in reviewing videos and reading books and articles from that time, I’ve come to respect the ST and what Tramiel was trying to do even more than before.

Tramiel wasn’t a technologist. Far from it: His son Leonard later revealed that the first computer his father was comfortable using was an iPad, and that was over 20 years after he had retired. But he was onto something. Through vertical integration and a relentless desire to give customers whatever they demanded, he gave us a series of inexpensive computers that would change the world, though he would never learn or master them himself.

The Commodore 64 and the Atari ST were in many ways better than the ostensibly more professional competition of their eras, just as Visual Basic later was, in many ways, better than its more professional competition. What each of these things shared was a focus on real people, not engineers or other professionals, and on delivering real value at a lower cost.

That combination of better and cheaper was always rare, which is what made these things special. But when you think about the modern era of personal computing, you quickly see that it’s rarer now than ever. Today, our industry is consumed by two forces that stand in sharp contrast to the magic of yesteryear: AI, which many don’t want, and enshittification, which literally no one wants.

Our industry, once a shining beacon of hope and futurism, is a cesspool today. We repeat sayings like “if you don’t pay for the product, you are the product” as if they’re religious mantras that actually mean something. But what we should be doing is asking questions. Like what it means when products become subscription services rather than one-time purchases, with costs that keep going up and quality that keeps going down, so much so that these services and the companies that make them now work aggressively against the best interests of their own customers. How did we get from the freedom of paying for a song once and taking it with us anywhere to paying for music each and every month for the rest of our lives? When did the dream that Silicon Valley promised, and once provided, become a literal nightmare?

I have some theories.

In my decades in this industry, I’ve come to understand that there are inflection points, moments where everything changes, even though it’s not always obvious at the time. In the course of writing the article series that became Windows Everywhere, for example, I came to understand that the point at which Microsoft ceded control of the personal computer industry it then still dominated came when it turned its back on the web platform it had previously championed and went back to its terrible proprietary ways. And in his book Hardcore Software, ex-Office chief Steven Sinofsky explains how that product line lost touch with the humans who use that software, focusing instead on big businesses and their needs.

Today, we describe those types of transitions as enshittification, of course.

And to bring this full circle, sort of, I will point out that Microsoft ruined Visual Basic when it turned its back on web technologies. This was when it created .NET and the C# programming language, both undeniably more sophisticated than its earlier efforts. But this is also when it killed classic Visual Basic, replaced it with the completely unacceptable Visual Basic .NET, always a second-class citizen in the .NET era, and relinquished its interest in the mainstream. Microsoft did with .NET what it did with Office. It abandoned us all.

This isn’t really about BASIC, of course; you can see these kinds of shifts everywhere throughout the industry over vast periods of time. But these are some of the changes I remember best because they ruined things the most for me. This is a form of nostalgia, a belief that things were better at some earlier time and worse today. And that can be dangerous, as our horrible U.S. government proves. But I don’t pine for an invented or imagined past that was allegedly better. I recognize that not everything was better in the past, and I see real improvements in more modern technologies, of course.

What I am nostalgic for, what I do miss, is the hope and optimism that once defined personal technology. That California sun-kissed optimism has given way to genius Big Tech leaders bending the knee before a president who is a curious combination of megalomaniac and imbecile, forever tarnishing their reputations and the ideals they supposedly stand for.

Given all this terribleness, the only optimism we have in this industry now comes from two places: the past, which can show us the way, and the Little Tech companies that are standing up to Big Tech and treating customers the way Commodore and Microsoft did in bygone days. And this, of course, is where I find myself drifting more and more often. I guess it’s a coping mechanism.

But we can’t bring back the past. I look at things like Raspberry Pi, and I love how this organization is trying to give a new generation of enthusiasts and hobbyists the same sorts of experiences I had as a child. But Raspberry Pi, to date, has not come anywhere close to duplicating the pioneer spirit of personal computing in the late 1970s and early 1980s. It’s not their fault, to be clear. But Raspberry Pi isn’t better than mainstream computers; it’s just cheaper and underpowered. The OS it runs is uninteresting. And the programming language that Raspberry Pi is using to teach the world how to write software? Python. Dear God. It’s no BASIC, I can tell you that. And it’s certainly no Visual Basic.

There are, at least, hints of that hope for the future in things like Notion, Affinity, Proton, and other Little Tech solutions, software and services that are both cheaper and better than the mainstream offerings from Big Tech. But the industry I grew up alongside has lost its way. It’s become, literally, Big Tech, the very thing that Apple and Microsoft, especially, were fighting when they first arrived on the scene, as companies like Amazon and Google were decades later. I don’t recognize any of them anymore. And what I do see now, I don’t like one bit.
