
We all know the story of how Steve Jobs pulled Apple back from the brink of bankruptcy and then orchestrated the biggest comeback in personal technology history by creating the iPod, iPhone, and iPad. The introduction of the iPad in early 2010 was interesting for a lot of reasons, but forgotten in all the mythology is the months-long wait between the iPad’s announcement in January and its release in April.
During that lull, I argued that Apple had erred by letting customers wait too long between the iPad’s announcement and its release, and that in doing so, it was giving them time to reconsider this expensive purchase and develop buyer’s remorse. Many of them, I thought, would come to understand that the iPad was unnecessary and perhaps redundant.
That’s not what happened, of course. And while it was impossible for a device like the iPad to duplicate the level of success Apple enjoyed with the iPhone, the iPad did go on to formally establish a new market for tablets, and it did so by shifting our expectations of what such a device was and could do. My blindness to this possibility was related to my being unique among reviewers. As a veteran of Microsoft’s Tablet PC efforts, I had probably used more of these PCs over the previous decade than anyone outside of Microsoft, literally, and I thought I had the iPad’s number. After all, I had seen it all, and it wasn’t like Apple was doing anything profound with a product I viewed as an oversized iPod touch. Except, of course, that by being focused and doing less, Apple was indeed doing something profound.
But that was the iPad and the end of an era. These days, when Apple announces new products, it releases them immediately or within a few weeks at most. And it’s unlikely that many customers ever rethink these purchases now: As we’ve often discussed, Apple’s audience consists of the most loyal customers imaginable, and they seem uniquely inclined to spend as much money as possible on whatever products and services the company introduces. This is what Apple’s competitors envy the most.
Of course, I am not the typical Apple customer. And not just as a reviewer, but as an individual, a user. I respect the company in ways that aren’t often obvious from my sarcastic teardowns of its hype-tastic product announcements, which are now expensive, Hollywood-like productions with Marvel movie-like special effects, further blurring the line between marketing and reality. But I am also by nature indecisive. I brood over product purchases for longer than most, and I immediately reconsider them as soon as I hit the “Buy” button. This is an uncertainty I do not associate with Apple’s most loyal customers.
I am also rarely completely satisfied with anything, and while some of this is surely just my nature, it’s fed by a real problem: the failings of the products I use. This isn’t an Apple-only problem, of course; my mixed feelings about Google Pixel hardware are well understood, for example. But because these things are often quite expensive, and because it’s not clear that I will even use them regularly, the doubt creeps in.
And that brings me to the iPhone 15 Pro Max, which Apple announced on September 12, opened for preorders on September 15, and began selling to customers on September 22. That’s just 10 days between announcement and release, but despite my best efforts, my preorder wouldn’t arrive until October 4-9, stretching my wait time out to about three weeks and giving me time to look at the reviews, which began appearing this past week.
As is always the case, the reviews are all over the map, with some praising the product’s upgrades and others panning them and downplaying Apple’s claims. This is typical. Navigating Apple’s marketing of any new device is complicated by the company’s habit of promoting evolutionary upgrades as being impressive, especially when compared to the prior year’s version. And so we, as consumers, need to navigate the propaganda, consider the inherent biases of individual reviewers, many of whom are gullible Apple fans, and decide accordingly. The days when objective reviews were the norm are long behind us, and you really have to be careful where you get advice.
After reading and, on YouTube, viewing several reviews, I came away with mixed feelings about my $1382.24 pre-order, which included the phone, a case, a charger, and a faster USB-C cable. Several reviewers pointed out that the “FineWoven” case I had purchased, Apple’s more ecological substitute for its previous high-end leather cases, was a lot less premium than expected and scratched instantly, easily, and permanently. And that the titanium sides of the Pro models discolor easily when touched (but return to normal when rubbed with a cloth). These types of quality issues are unusual for Apple, but they happen. And while I wondered whether they were precursors to bigger issues, I simply returned the case I had preordered and bought a similarly colored silicone case instead.
But it was the differing takes on the iPhone 15 Pro Max camera system that really gave me pause. And for a 24-to-48-hour period late last week, a time in which I was admittedly consumed by Panos Panay leaving Microsoft, the historic Xbox leak, and the Microsoft AI special event, this purchase hung in the balance, stuck in the back of my brain. I was ready to cancel it and consider a lesser iPhone 15 model, or perhaps even not upgrade at all. And then, amid all the chaos of that crazy, historic leak, I found the time to examine the iPhone 15 Plus again and see how much I would save and what I would lose in doing so.
It was pretty damn close for a little while.
Sometimes I explicitly purchase a product so I can review it and then return it, a practice that seems to bother some readers, which I find curious because I have no other choice; I’m not rich and I can’t afford to just buy everything. And sometimes I end up returning a product I purchased for review or otherwise because it didn’t quite meet my requirements. But there are also times when I’m certain that I will keep something I buy for review. And that’s true even if I don’t intend to use it full-time. This has often been true of iPhones in recent years, as I generally prefer Pixel phones despite their issues (and still go back and forth regardless). It’s often related to me needing to have a reasonably recent version of a product around for testing purposes. And I am always open to switching: If some product proves to be superior to whatever I’m using at the time, I will make that change.
On that note, I went into the iPhone 15 Pro Max expecting to keep it regardless of whether I would use it full-time, and you can see the truth of that in me electing to trade in my previous iPhone, an iPhone 13 Pro. I did not go into this expecting to switch platforms though, again, I am open to that and would do so if the overall experience was superior to whatever Google will deliver with the Pixel 8 Pro early next month. I certainly intended to give each of the phones the time needed to make that decision, and to keep both handsets regardless of the outcome.
But those reviews.
Many pointed out that Apple had said there were “seven lenses” in the iPhone 15 Pro/Pro Max camera systems, though I feel that even those most susceptible to this kind of marketing probably understood this wasn’t the case and that these phones each have three lenses, not seven. But I watched the announcement and couldn’t recall exactly what they said, so I watched it again.
Apple never made that claim at all. At its announcement event and in its press release for the iPhone 15 Pro series, Apple instead touts “the incredible versatility of multiple lenses” and how the main camera in these devices has “the equivalent of seven lenses.”
That’s a great example of Apple marketing. Camera lenses and their respective capabilities are a more technical and complex set of topics than most care to understand. Instead, most just want to take the best pictures possible in the easiest way possible. But then the iPhone Pros are, as their name suggests, a more professional product than the other iPhone models. And the iPhone Pro Max, in this case, really does take it to the max, if you will, with even more advanced camera capabilities.
And so we need to wade through some jargon. Apple’s misunderstood statement about “seven lenses” is its way of making the technical and complex more understandable to mainstream users, many of whom will buy a Pro model, lured by its advances over the non-Pro iPhones and regardless of whether they need or will use those features. That’s laudable. But what Apple didn’t say is, I think, as important.
Apple’s discussion of the iPhone Pro line’s camera advances came toward the end of the iPhone 15 Pro segment of the event earlier this month. It was also the final major feature discussion of the entire event. And Apple spent a lot of time on this topic because camera quality matters to its customers and, as executive Greg “Joz” Joswiak literally said during the event, it’s a key reason so many spend hundreds of dollars extra on a Pro model.
Apple, like Google, has long been promoting the computational photography capabilities of its camera systems, particularly in the Pro models, and its “Photonic Engine” is the marketing term for these capabilities. (Apple likes to brand things and it tends to use non-technical language, while Google, a more engineering-focused company, literally pushes the term “computational photography” explicitly and more often.) But the even more basic marketing is that the iPhone Pros take great photos thanks to a combination of hardware and software, which is absolutely correct.
On the hardware front, the iPhone Pros have “multiple lenses,” which in this case means three, or one more than is offered on the non-Pro models. And while most people buying these devices literally have no idea how this compares to the cameras that professionals have long used, it’s helpful to understand that those photographers carry separate physical lenses, most of which are quite big, and can attach only one at a time to a camera body. What the iPhone and other smartphones achieve in much smaller devices, and without needing to carry, and then attach and detach, separate lenses, is incredible, and it’s only possible thanks to computational photography.
But we’re on the 15th iPhone generation now, and this product line has been in the market for 16 years. And for the first time, Apple introduced a new concept, that one of its three lenses can somehow provide “the equivalent of seven camera lenses.” And to make its case for this claim, it had to get technical.
Last year, we were reminded, Apple introduced its “innovative” 48-megapixel (MP) main (wide) lens in the iPhone 14 Pro family. This featured a “quad-pixel” sensor in which each grid of four pixels is combined into a single pixel in the resulting photo, which is 12 MP, not 48 MP. This process is called pixel-binning, and it’s common today in the high-megapixel cameras found in premium smartphones. It’s a nice combination of high quality and more practical file sizes, and it’s what most people want. But high-resolution sensors like this introduce other possibilities, and Apple touted one of them last year: The 2X zoom preset in the Apple Camera app actually uses “the middle 12 megapixels” of an image captured by the 48 MP lens instead of a separate lens, as was the case when Apple first introduced 2X optical zoom capabilities (and 3X optical zoom in recent years). It simply crops the image captured by the lens.
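If the arithmetic behind binning and that 2X crop seems abstract, here’s a minimal sketch in Python that shows both operations on a dummy 48 MP frame. The 8064 x 6048 dimensions and the simple averaging are my assumptions for illustration; Apple’s actual pipeline is far more sophisticated than this.

```python
import numpy as np

# Quad-pixel ("2x2") binning: each group of four sensor pixels becomes one
# output pixel. A simple average is assumed here purely for illustration.
def bin_quad_pixels(frame: np.ndarray) -> np.ndarray:
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# The 2X preset as a straight crop of "the middle 12 megapixels" of the
# full-resolution frame, i.e. the center quarter of the image.
def crop_2x(frame: np.ndarray) -> np.ndarray:
    h, w = frame.shape
    return frame[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]

frame = np.zeros((8064, 6048))        # roughly a 48 MP frame (assumed size)
print(bin_quad_pixels(frame).shape)   # (4032, 3024), roughly 12 MP
print(crop_2x(frame).shape)           # (4032, 3024), also roughly 12 MP
```

Either way, you end up with roughly 12 MP of output, which is why the binned default and the 2X crop can share a file size even though they are produced very differently.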
A few things were left unsaid in all that. First, the 48 MP lens’ pixel-binning capabilities are not used when an iPhone 14 Pro user snaps a 2X shot at 12 MP. And second, this isn’t optical zoom. It is, in Apple’s words, “optical quality at a familiar focal length.” And that term, focal length, went unexplained too. (More on this below.) Granted, if the quality is good enough—thanks again, computational photography—users won’t notice, let alone complain.
A related question: When will Apple offer higher resolution ultra-wide and telephoto lenses too? Google is doing so, at least: The Pixel 7 Pro has a 50 MP main lens and a 48 MP telephoto lens, for example, and the Pixel 8 series will reportedly add a 48 MP ultra-wide lens, closing the door on the low-resolution past for good. But even the latest iPhone 15 Pros still have relatively low-resolution 12 MP ultra-wide and telephoto lenses.
Anyway, that 48 MP main lens is allegedly “more advanced” than the one in the iPhone 14 Pros, which is weird because a quick comparison of its specifications—24-mm focal length, ƒ/1.78 aperture, second-generation sensor-shift optical image stabilization (OIS), seven-element lens, 100% Focus Pixels—to those of the iPhone 14 Pro indicates that they are in fact identical, though Apple said at the event that the sensor in the lens used by the iPhone 15 Pro “is even bigger.” Since I can’t find any information about that little detail, and the more technical reviewers routinely found no differences, whether this claim holds up comes down to software. And with the iPhone 15 Pro, Apple is for the first time also supporting the taking of “super-high-resolution photos” at both 24 MP and 48 MP.
Most mainstream users will never enable a 48 MP capture mode, in RAW or HEIF format, because of the resulting mammoth file sizes, but the iPhone 15 Pro defaults to a new “super-high-resolution” 24 MP format that’s a “practical” middle ground between the 12 MP past and its native 48 MP capability. This made me wonder what happened to pixel binning, but according to Apple’s website, this lens “combines the best pixels from a super‑high‑resolution image with another that’s optimized for light capture” to create a single image. And that is a form of pixel-binning, which is computational photography, a term Apple actually did use at this point in the event.
But this is where things get squirrely.
“Since the new Photonic engine now uses a 48 MP image, iPhone can shoot higher resolution photos in 24, 28, and 35-mm focal lengths, a very important range for pro photographers,” Apple director Misha Scepanovic said during the announcement. “They can quickly switch between these new options and even choose a new default lens, customizing their camera experience to their creative needs.”
Um, wait. What?
To be clear, these focal length choices all use the same lens. And because this presentation was prerecorded, and the speaker is a subject expert, this language was deliberate and not him misspeaking. For marketing purposes, Apple says that the iPhone 15 Pro has three cameras, each of which is made up of some number of lenses. But to me, and in Apple’s support document, the iPhone 15 Pros have a camera system composed of three lenses, each of which is made up of some number of lens elements. And the main lens has seven such elements, just as it does in the iPhone 14 Pros. These elements are the little curved plastic pieces that you see in those exploded diagrams Apple and others use, and they each bend light a bit differently, working together to achieve various degrees of focus, aperture (the amount of light, basically), and zoom so that the underlying sensor can turn the light it’s given into an image.
Apple probably explained the term focal length at some past event, but because of the design choices it made with the iPhone 15 Pro’s camera system—including the app UI that now lets users toggle between different focal lengths (on the main lens only) and even choose a default from three choices—it should have spent some time doing so during this year’s iPhone event. I’m no expert, but the short version is that focal length is a measurement, in millimeters (mm), of the distance from the optical center of a lens to the sensor, and its value determines the field of view (FOV, which is measured in degrees). And it’s important to know that the focal length of a smartphone camera lens is calculated, not real, because these systems are so small: The correct language here is “focal length equivalent.” Companies use these values because photographers understand them from their work with traditional cameras.
In any event, Apple and other smartphone makers have long indicated the quality and capabilities of their camera lenses by providing three values: The focal length equivalent (mm), aperture (by f-number, or what I thought of as an “f-stop” back in the day), and FOV (degrees). And so this new/not-new main lens in the iPhone 15 Pros is a 24-mm equivalent with an ƒ/1.78 aperture and what I assume is an 84-degree FOV. (Apple explicitly states the FOV for the ultra-wide and telephoto lenses, but not the main lens.)
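For what it’s worth, that 84-degree guess checks out if you run the numbers. Here’s a minimal sketch, assuming the usual 35 mm (full-frame) equivalence convention, where the diagonal field of view follows from the focal-length equivalent and the diagonal of a 36 x 24 mm frame.

```python
import math

# Diagonal of the 35 mm "full frame" (36 x 24 mm) used by the
# focal-length-equivalent convention: about 43.3 mm.
FULL_FRAME_DIAGONAL_MM = math.hypot(36, 24)

def diagonal_fov_degrees(focal_length_mm: float) -> float:
    # FOV = 2 * arctan(diagonal / (2 * focal length))
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * focal_length_mm)))

print(round(diagonal_fov_degrees(24)))    # ~84 degrees for a 24-mm equivalent lens
print(round(diagonal_fov_degrees(120)))   # ~20 degrees for a 120-mm (5x) equivalent
```

That 24-mm figure lands at roughly 84 degrees, which is presumably where the unstated main-lens FOV comes from.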
Most of us are not professional photographers, and so these values will be confusing to many. But knowing a few basics really helps. For example, as you move from an ultra-wide lens to a main (usually wide) lens and then to a telephoto lens, the field of view shrinks as the magnification increases. Apertures with lower f-numbers let in more light than those with higher f-numbers. Human eyes have an FOV of about 180 degrees. And modern smartphones like the iPhones we’re discussing complicate matters further by turning the multiple focal length capabilities of their lenses into marketable features.
Consider the iPhone 13 Pro that I will soon trade in to Apple: The camera app on this phone has three zoom level choices right on the viewfinder, .5x, 1x, and 3x, and each choice maps directly to one of the three lenses in the camera system, ultra-wide, main (wide), and telephoto, respectively.
But the camera app in the iPhone 15 Pros displays four zoom-level choices: .5x, 1x, 2x, and 5x on the Pro Max (the smaller Pro offers 3x instead of 5x). And since there are still only three lenses, that means that at least one of those represents either digital zoom or image cropping. And there’s more: users can tap that 1x choice to toggle the main lens between two other zoom levels, 1.2x and 1.5x, each of which has a different focal length (28 mm and 35 mm, respectively) than the default 1x, which is 24 mm. So in a sense, there are really six zoom (for lack of a better term) choices (or presets) in the camera app.
In case it’s not obvious, these choices impact the resolution of the resulting image. The ultra-wide and telephoto lenses, again, are just 12 MP and so that’s the resolution you get when you select either choice. But while the main sensor is 48 MP, you can choose between 12, 24, and 48 MP images, with the understanding that those values only apply when you’re shooting at 1x. What happens when you use those 1.2x and 1.5x choices instead?
I would imagine that you get a lower-resolution image and a correspondingly smaller file size for those shots. I would further imagine that this isn’t a big problem since you’re shooting at 24 MP by default anyway, and so the resulting images should generally be of higher quality than what is/was otherwise possible using digital zoom at the native resolution. And that means that reviewers complaining about this process, which some do, are missing the point.
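My back-of-the-envelope math, assuming a straight crop with no upscaling or image fusion (which Apple may well apply on top), looks like this. Note that the 1.2x and 1.5x labels are rounded; the raw ratios are 28/24 and 35/24.

```python
# Rough pixel budget left after cropping the 48 MP main sensor to a longer
# focal length, assuming a straight crop with no upscaling or image fusion.
NATIVE_MM, NATIVE_MP = 24, 48

for label, mm in [("1x", 24), ("1.2x", 28), ("1.5x", 35), ("2x", 48)]:
    crop_factor = mm / NATIVE_MM
    print(f"{label:>4} ({mm} mm): ~{NATIVE_MP / crop_factor ** 2:.1f} MP")
```

Even the 35-mm preset leaves roughly as many pixels as the 24 MP default output, which helps explain why Apple is comfortable framing these as “optical quality” options rather than digital zoom.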
And if this sounds at all familiar, it might be because Google discussed how it uses the same cropping technique during the Pixel 7 event last year.
“This year, we re-engineered Super Res Zoom to go even further with high-resolution images,” the presenter explained. “Our next-generation Super Res Zoom allows us to roughly double the magnification of our optical lenses. At 2x, Super Res Zoom crops the 50 MP main (lens) using a high-resolution mode to provide a 12.5 MP image. It has more resolution than before, but the smaller pixels are also noisier [and fixed using computational photography], giving you high resolution and low noise. The result is a full 12.5 MP photo at 2x … similar to what you’d get with a dedicated 2x telephoto (lens).”
Interestingly, Google appears to go beyond what Apple is now doing by also combining images from the main and telephoto lenses to construct images taken between 2x and 5x zoom, creating “a composite that is as sharp as possible.” And above 5x, of course, you get what Google won’t call digital zoom, enhanced via computational photography.
Apple’s approach to zoom is a bit different. And that’s especially true of the iPhone 15 Pro Max and its 5x optical zoom. This was rumored to be a periscope-style lens, which lets hardware makers achieve a longer optical path than would otherwise be possible by folding it sideways inside the device and using an angled mirror or prism to bounce the incoming light to the sensor. But instead, Apple did what Apple does and created a unique but similar folded system that it calls a “tetraprism,” which reflects the light multiple times inside the phone before it reaches the sensor. Close enough. What will really seal the deal here is its ƒ/2.8 aperture—“the largest of any smartphone”—and how well the OIS and auto-focus work.
But where the telephoto feels authentic, for lack of a better word, Apple’s embrace of multiple focal lengths in its much bigger main lens is both good and bad. Yes, it’s still a positive development for anyone who cares about photo zoom capabilities. And you don’t really need to know all this, I guess: Just use the camera app, figure out which default you want, and then use those four presets for most shots, manually zooming (with multitouch) only when desired. But strictly speaking, these other non-native focal lengths are really just digital zoom. Digital zoom backed by computational photography prowess, to be sure. But still digital zoom.
Does it matter?
In the end, I decided that it does not, at least on paper, and that I will need to actually use the iPhone 15 Pro Max in a variety of conditions to determine for myself how this compares to whatever Google delivers in the Pixel 8 Pro. But I am once again reminded of how devastating great marketing can be to the decision-making process, having walked away from the Apple event feeling impressed by the Pro camera features. And that inevitable wake-up call, this time delivered by reviews critical of these features, did make me wonder whether I should even bother. If it’s not any better than the Pixel, why not just get a cheaper iPhone and save a lot of money?
It’s a great question. But the only other new iPhone that interested me, because of its larger display, was the iPhone 15 Plus. This device also offers the Dynamic Island feature I’ve not yet tried and USB-C, which, though a slower performer than the version in the Pros, is still USB-C. But the Plus lacks the more powerful Apple Silicon in the Pros with its gaming possibilities. There’s no Action button. And it doesn’t have the Pro cameras, of course. And I really do think it’s important to test all that too. Even if I go forward with the Pixel day-to-day.
We’ll see. But this is the kind of rabbit hole that my brain takes me down all the time. And now I need to stop thinking about this.