From the Editor’s Desk: No Synthesizers!

As a young fan of rock music in the 1970s, I would pore over my father’s extensive album collection in the years before I would start buying my own. Some of the packaging was art in its own right, including the working zipper on the front of the Rolling Stones’ Sticky Fingers. And many of these albums were chock full of information about the musicians and their music.

I was especially happy when song lyrics were part of it, as I still thought that these people were somehow elevated and had something to teach me. But I would read and re-read whatever there was, from the obvious to the cryptic. And one of my clearest memories of that time, mostly because it was so confusing, was that the early Queen albums boasted that they had been made with “No Synthesizers!”

This was a different era, obviously. But even as a child, I understood that Queen was taking a stand … against something. In the delightfully quaint technology of the era, the name synthesizer does a reasonable job of communicating what that thing does. And I vaguely understood that a synthesizer was like a piano, but electronic, and that it could be made to sound like other things. And so Queen was perhaps broadcasting that its music was somehow more real, more human-made. This was particularly interesting given the technical excellence of that music.

History shows us that Queen didn’t just step back from its “No Synthesizers!” stance, but rather embraced synthesizers in the 1980s to a degree that was both hypocritical and hilarious. Songs like Radio Ga Ga are a celebration of synthesizers and synthesized sound, creations that the younger version of that band would have mocked just a few years earlier.

Queen’s cheeky U-turn on synthesizers is a lot of things, but the band’s stance against this technology in the 1970s now reads as misguided, if not clueless. These are some of the smartest and most talented individuals to ever grace a stage, and yet they succumbed to a strange Luddite-like pushback on a technology that perhaps they should have utilized and innovated with. Fortunately, they got there eventually.

History often repeats itself, and I was reminded of this Queen story when I noticed that Epic Games founder and CEO Tim Sweeney tweeted his support for Steam and other game stores dropping their “Made with AI” labels. Yes, there are places for this kind of labeling, but game-making is not one of them. As Sweeney notes, “AI will be involved in nearly all future [game] production.”

Right.

If you like your fights nasty, AI is the perfect topic. The responses to Sweeney’s tweet are predictably all over the place, but this isn’t really a debate, as what Sweeney predicts is obviously correct. I’ve made the point that AI is fundamentally about saving time, but that could be expanded. It is also fundamentally about saving money. And, in a more nebulous sense, it is about allowing us to achieve things that would otherwise be, wait for it, too time-consuming and expensive to even consider.

A familiar (if somewhat tired) example will suffice to make this point: I write some number of articles each day for Thurrott.com and each needs an accompanying graphic. Not so long ago, we paid for access to various stock photo libraries so that we had a selection of potentially useful images. (And there were free sources as well.) One problem with these libraries is that I can’t “own” any of the images I choose, and so I would sometimes see an image I had used on other sites as well.

Some of my friends are graphic artists and photographers, and I’m sure they’d be delighted if I used their work. But I can’t afford to do that, especially if it were some exclusive arrangement. They’re professionals, they rightfully expect to be paid for their work, and that work takes time. Some do contribute to stock photo libraries, some don’t. But there was never going to be a world in which I could call one of them some morning, describe a story I was writing, and then ask for unique artwork.

And yet, that is what AI is doing for me already. And then some, given that I typically get three or four images, each generated in seconds, to choose from and then can fine-tune a preferred image until it’s perfect. Each of these images is unique to me. Yes, other sites using the same AI with the same prompt may generate similar images. But not identical images, and the chances of this happening are now much more remote anyway.

In this one small use case, I am saving money by not paying for stock photo library access and by using AI that is free to me. I am saving time because the images I ask for take just seconds to make, far less time than I used to spend browsing through stock photos. And I am achieving something that was previously impossible in that I can now keep directing the AI to remake images to my needs as many times as I want. The end result is objectively better, or at least it can be.

The creative asset needs of a modern video game are exponentially bigger than my day-to-day need for images I can stick on the top of web articles, many of which are one-off news posts that will rarely be read just days later, if ever again. These are often multi-million-dollar operations, akin to Hollywood movies, representing many man-years of work.

That’s all very obvious, I hope, but one of the things that always gets lost in the debates around AI and job loss is that good technology, meaning technology that has a net benefit to mankind, will always result in job loss. And keeping this to the small corner of the jobs market I am waltzing around here, we are all familiar with the multiple generations of technological improvements that occurred over many years, with AI being just the latest.

It wasn’t that long ago that we were debating whether an image was “real or Photoshop,” for example. And before that, it was airbrushes removing wrinkles and other defects. And before that, photography itself, a technology that threatened the livelihoods of painters. These and many other advances were all met with suspicion and derision, often by those whose jobs were most directly impacted by change.

As a technology enthusiast, I’m fascinated by AI. As a business owner, I use AI to a small degree, mostly via the image generation capabilities noted above. But as a writer, I don’t use AI to generate words, and maybe I never will.

I do recognize, though, that writing doesn’t come naturally to many, and I routinely complain about the quality of the writing I see online and elsewhere. So this is clearly a need, and I don’t begrudge those who take advantage of AI-based writing assistance. I am not taking a stand here, will not be labeling anything here or elsewhere as “No AI!” or whatever. But that will happen elsewhere, of course. I suppose it already has.

The irony, of course, is that all writers utilize AI every single day, through spelling and grammar checking and other tools built into the word processor app they use. That app was once an affront too, as were the PCs on which it runs and the dedicated word processing machines, electric typewriters, and manual typewriters that preceded them. Hell, there are probably people writing articles or even books by dictating into their phones now.

AI is the ultimate synthesizer. Like Queen before us, we can be self-righteous, stand in the way of progress, and fret over AI hallucinations and AI slop and whatever else. But like Queen before us, we’ll all embrace AI eventually. It’s inevitable. And one day, we’ll look back and wonder what all the fuss was about.

I bet AI will be able to tell us.
