
Happy Friday! I’m a bit later than usual because we had our house closing today, but let’s finally kick off the weekend with another great set of reader questions.
jt5 asks:
iPad Air versus iPad Pro: I recently had my iPad Air stolen in San Francisco, and I took it as an opportunity to replace it with the iPad Pro. (I got a really good deal from Costco, almost the same price as the Air.) My only complaint about the Air was that I did not like where the fingerprint sensor was; it felt awkward when using it in a case. I like the Pro's facial recognition better. I am curious what your thoughts are on it, as I believe you are using the Air.
Yes. I bought an iPad Air in April 2021, so almost two years ago. But to put this in perspective, three things have changed since then: the iPad Air has switched to an M1 chipset (mine has an A14 chipset, the same chipset used in the iPhone 12 series), Apple has shipped a new iPad mini, and Apple has finally redesigned the base iPad with the same basic design as the iPad Pro, iPad Air, and iPad mini, with smaller bezels and flat sides.
I mention all that because at the time I bought my Air, this was the cheapest way to get the newer design, and I couldn’t bear getting another big bezel iPad. Had the latest iPad mini revision been available, I would have gotten that instead. And had the newest base iPad been available, I would have gotten that. I chose the Air because those options weren’t available yet. Plus it has USB-C instead of Lightning.
The thing is, I use the iPad exclusively for consumption tasks: reading daily and occasional video viewing while on trips. The ability to use this thing sort of like a laptop is not of interest to me, but if it was, I’d probably be looking at the bigger iPad Pro, which is quite expensive. Since it isn’t, I’d almost certainly just get the cheapest iPad I could today, so the base iPad.
As far as my current iPad Air goes, I have no major complaints. I agree the fingerprint reader isn’t ideal, but it mostly works fine. Mine doesn’t support Face ID, though it looks like the new version does (as do the iPad mini and base iPad). I suspect the screen is nicer than that on the iPad mini or base iPad, but I’m not sure if that would matter much for my use. If I can get another year out of this thing, and I’m sure I can, I’ll reevaluate the available choices then. But I wish there was a viable Android tablet I could consider. Maybe the Pixel Tablet? I doubt it with the first version.
The other question I have is education related. I am thinking about getting another degree and have both a Surface Pro and the iPad. I prefer the Surface in almost every case, except for when I travel. There are many more apps available on the iPad for video content and the like. Since I travel a lot, I will need to be doing coursework while traveling. How well do you think the iPad will work for this? I know it is an opinion question, but I appreciate your feedback, as you also travel a lot.
I would bring both, personally: both are thin and light and it wouldn’t be overly bulky or heavy to do this. I routinely travel with a laptop and my iPad Air and having the choice between them is nice, as is the ability to use battery life accordingly. (Since I only use the iPad for consumption things, I can dedicate as much battery life as possible to productivity work on the PC.)
But you should evaluate how well just the iPad Pro can work for you by trying to use only it day-to-day. The thought of using a single device, especially one that thin and light with such terrific battery life, is quite compelling. It's not something I could do. But if you can, I don't see why not.
Daninbusiness asks:
It's 2023 and we're finally seeing some next-gen Xbox exclusives, in the sense that some new games aren't being offered directly on the Xbox One platform. I really do want to play the upcoming Star Wars Jedi: Survivor game, but I haven't yet upgraded. It's not super urgent, as I have an Xbox One X that works just fine. Both the Xbox Series X and S seem pretty good, and I'm torn on which one to get. The Series X seems more future-proof and would offer the better experience on our main 4K TV. The Series S offers some cost savings while it should still be quite capable of playing new releases. Gaming is pretty much an activity just for me, maybe just a few hours per week considering family activities and obligations. My kids tend to be playing Minecraft on PC or Pokemon on Switch when they have an opportunity to play. So I bet a Series S might be just fine, all things considered. Any perspective which might tip the scales for me? Thanks!
Get the Series S.
The Series X is more future-proof. But the Series S is so inexpensive, works so well, and is silent and tiny. It's almost a no-brainer.
Microsoft sent me both consoles when they were first released and obviously I migrated to the Xbox Series X pretty quickly and stayed there. But at some point, maybe 1 to 1.5 years ago, I switched over to the Series S to see whether I could tell a difference. And I could not from a user experience perspective, though the Series S is smaller, quieter, and runs less hot. I realize my particular use case—which is basically to just play the latest Call of Duty—is not necessarily representative of the wider world. But given what you’ve told me about how you expect to play, I really do think the Series S is all you need.
I’ve never regretted switching. Not once.
jchampeau asks:
I’m curious to know what role, if any, you think overconfidence and the Dunning-Kruger effect play in tech journalism today—especially in mainstream papers and publications.
Sometimes it feels like that’s all that is driving tech journalism today. I can’t tell you how often I’m reading the news in the morning and remark out loud to my wife, who is surely bored of this by now, that “I hate my industry.” The writing is so terrible, so often.
In a recent episode of Hard Fork, a New York Times tech podcast hosted by Kevin Roose and Casey Newton, one of the hosts mentioned to the other that the use of AI is becoming more common and that LLaMA, Meta's AI platform, can be run on an individual user's PC. Then, just a few moments later, a question came up about how AI could be contained or controlled to prevent some kind of AI doomsday scenario, and the answer supplied was "by the AI platforms' APIs." That seemed to me like a nonsense answer: if at least some AI tools can run on individual PCs, API-based security measures would presumably be easy to overcome. And it occurred to me that perhaps these two men, speaking as if they are experts or have done extensive research, don't know what they're talking about. Kevin Roose's prior reporting on ChatGPT already raised questions about his aptitude and suitability for writing on such topics, at least regarding Bing and ChatGPT, yet he speaks and writes with authority, as if he's no longer curious.
Yes. A million times yes.
I hate to go after individuals, but Roose’s ridiculous posturing after his first Bing chatbot experience—“I’m switching to Bing!”—was so obviously wrong that I had to call him out. That he completely renounced his original blathering a week later only confirmed my fears. This guy has no idea what he’s talking about. That he later coauthored an article called, and I’m not kidding, “Everything You Need to Know About AI” had me reaching to cancel my New York Times subscription. (My wife reads it too and was not amused at this attempt.)
Here’s the thing. It’s normal to suffer from Imposter Syndrome. It’s not OK to suffer from “I’m the expert” syndrome (as I think of it). But … we’ve all done it. For example, after a decade of Tablet PC experience, I thought I was coming from a position of authority when it came to opining on the iPad. Unfortunately, the iPad wasn’t a Tablet PC replacement (though it’s edging nicely into that world now), it was a new kind of consumption device. And my experience didn’t matter.
That said, I do have moments of clarity. When I was briefed on Microsoft’s quantum computing efforts a few years before the pandemic, I came back to the press room and wrote the beginning of the following non-article:
Everything You Need to Know About Quantum Computing
…is the article I wish I could write. But I can’t, because I do not understand it.
I passed my laptop around to Mary Jo and others nearby for a laugh. But we should always be this honest and self-aware.
I began to think about other people like Walt Mossberg, who was so biased in favor of Apple that I sometimes found it difficult to take what he wrote seriously.
I am fascinated and appreciative that you brought up this example as well, speaking of people I’ve had to bite my tongue over. I actually started an article about Mossberg many, many years ago and added to it from time to time but of course never published it because it was too mean-spirited. But there was no bigger one-sided windbag in our industry than that guy—well, maybe David Pogue, who was also a wife-beater, literally—and no one less deserving of attention. This guy sickened me. He should have played no role in our industry at all.
(Microsoft told me one time that they had visited Pogue to show off their Wi-Fi gear, which was quite good. When they asked him whether they could leave the hardware for review, he replied, "You can leave it, but I am never going to review it." This was the New York Times' tech columnist at that time. What a dick.)
Do you think the big, historic outlets these people wrote/write/podcast for give them a sense of overconfidence?
Yes, I do. But I also think that writing for a major mainstream publication—the New York Times, the Wall Street Journal, etc.—brings with it an even greater requirement for responsibility. And that requirement is not being met.
As an aside, I have unsubscribed from Hard Fork.
I had never even heard of it. But I will decline to look it up. 🙂
helix2301 asks:
What do you think is the future of Windows in the content creation space? I know so many friends who have abandoned Windows for the Mac in the creative space. Ant Pruitt just recently switched from MSI laptops to the Mac for reliability and ease of use.
The weirdest thing about what you wrote above is that bit about MSI. MSI? Who uses MSI?
But this idea that creators—especially those who work primarily in video—might gravitate to a Mac has some basis in reality: a big chunk of this market does gravitate to the Mac, whether that makes sense or not. I am sure there is data somewhere that shows that an M-series Mac can encode video or whatever faster than most PCs and do so while eating up less battery life. That is definitely a thing.
That said, I use PCs almost exclusively for those activities in addition to my normal work, and I don't see any need to switch to a Mac. Indeed, the HP Dragonfly Pro I'm using right now competes head-to-head with the latest MacBook Pros from a "performance per watt" perspective, which is what all the Apple guys seem to care about the most. More on that in my review, but modern PCs are fantastic.
I think a lot of this comes down to real-world needs, experience, capabilities, and costs, the latter of which is about much more than the one-time price of a machine but also includes all the software switching costs, etc. And it depends on what you are doing, what software you’re using, etc. Is this your career or a side thing? So many questions.
I know that I myself got sick of struggling with audio issues, and I only use my Mac for podcasting, game streaming, finances, etc. Because of its reliability, it boots up and works every time.
Having experienced that with PCs, I’m not sure what to say. But again, it’s about personal experience and need.
I use Windows for all my coding projects, but I always make sure to back up to GitHub in case of a system crash, which has happened because of Windows Update, which you can't turn off. I rate Windows as the best platform for developing because of its openness and ease of use, but I would never rate it best for any of my content creation. In PC gaming, Windows is king without a doubt as well; the Mac cannot even touch Windows. Is this just Apple's niche?
In so many ways, yes. But a lot of that is perception.
We were in Mexico recently and went to a sushi place, where we ran into three other Americans, one of whom was very curious about a lot of stuff. When he found out what I did for a living, he asked me what kind of computer I used. After answering, I asked him what he used, and I had to force myself not to ask, "What kind of MacBook do you use?" because I could already tell he was that kind of digital nomad content creation guy. And sure enough, it was a MacBook Pro, which he explained was much better than his last computer, a never-ending series of problems, none of which were ever fixed properly. Intrigued, I asked him about that previous computer. What was it? A previous-generation MacBook Pro. Hah.
My wife does photography and has always used a high-end Windows machine. She recently got a cheap MacBook Air, and she says Adobe runs better on it than on Windows. I have heard this from many people. Is there something special Adobe does for the Mac?
Yeah, Adobe has always done a great job of optimizing for Apple Silicon for whatever reason. I’m not sure what it means to “run better,” so to speak, as I use Photoshop and Premiere Elements on my PCs all the time and they work great. But again, there is some weird special connection between Apple and creators and whether it’s real or imagined or some of both, whatever, it’s there.
My friend just bought a Mac because AutoCAD is coming to M-series Macs. I never thought I would see this day. What are your thoughts?
Apple Silicon chipsets are not enough for me to switch to a Mac. But I get it: these machines do deliver great performance and terrific battery life. They’re still Macs, of course, so they have all the familiarity and functional downsides I don’t like. But Intel is clearly gunning for Apple with its hybrid chipset designs and as I alluded to above, the Dragonfly Pro, which uses a modern AMD chipset, has its own answer to that dynamic that I find quite compelling.
The thing is, the Apple Silicon chipsets haven’t improved the Mac’s standing in the broader market all that much. At the end of 2021, the Mac had 7.8 percent share, the same as the year before. At the end of 2022, it had 9.8 percent. So, a small but meaningful gain. Maybe the way to look at this is that Apple has strong mindshare and probably usage/market share with content creators. But it’s not like Windows isn’t a big deal in that market too. Probably bigger than the Mac, actually, when you factor in non-artistic creators like bloggers.
andrew b. asks:
Say you had to live in your Mexico City apartment permanently and you absolutely had to buy a car there. What do you buy?
A VW Bug in as close to mint condition as possible. Which is quite possible there.
I’ve owned three VW Bugs in my life, including my first car, a 1972 Super Beetle. I also owned a 1973 Beetle and a 2000 New Beetle (which went on to be my son’s first car). VW had to stop selling the classic Beetles in the U.S. in the late 1970s because of emissions, not safety, as many believe. But they were made and sold in Mexico until 2019 (!). The last Beetle plant was in Puebla, Mexico, just two hours south of Mexico City.
I used to be an expert on this car and could identify the model year of almost any bug correctly based on the often very small physical differences from year to year. But that’s lapsed, and I have little understanding of how the past 30 years of Mexico-based bugs differed aside from a vague understanding that emissions issues made the more recent ones quite underpowered. So I would probably want a true classic, albeit a newer version, so something from the 1970s. Kind of the sweet spot for these vehicles, I guess. At least for me.
yoshi asks:
Do you still plan on moving back to Pixel? Or did you make that move already?
I did switch back to the Google Pixel 7 Pro after we got back from Mexico City. So far, so good, but like everything else in life, there are pros and cons.
Did you watch Arrested Development? If not, it helps to know that there was a running gag on that show where various characters would say, “I’ve made a huge mistake.”
With that in mind, the morning after I switched, I picked up my phone and it failed to sign in via facial or fingerprint recognition, noting that I had to "sign in with a PIN for increased security." Knowing that facial and fingerprint recognition are increased security, I muttered "I've made a huge mistake" and filed that one away in the cons column. And it's happened several times since then. Seriously, Google. (Adding insult to injury, you can't just type your PIN; you also have to tap an "Enter" button. Seriously, Google.)
The grass is always greener, I guess. But the goal is to use the thing that causes fewer moments like that and has the better overall experience. I’ll stick with the Pixel for now.