Google’s press event today was curiously subdued and even boring at times. But there was one major highlight: The Pixel 4 series, which appears to offer computational photography capabilities that surpass anything that the rest of the industry has to offer.
That’s important because there is a perception now—an incorrect perception, based on my experience—that Apple has somehow caught up to or even surpassed Google (and Huawei and others) when it comes to the quality of the iPhone 11 Pro camera system, in particular in low light. I’ll have an iPhone 11 Pro Max review soon, but the short version is that Apple has absolutely improved its camera system in low light scenarios and has added a neat ultra-wide capability. But these are features the rest of the industry already had. It’s not “better” than the recent Pixels or Huaweis. It’s just in the ballpark now.
Or was in the ballpark, as today’s Google event suggests. It looks like the Pixel 4—by which I mean both the Pixel 4 and the larger but otherwise identical Pixel 4 XL—has leapfrogged the competition yet again.
Before getting to the camera, which Google naturally focused on (ahem) at its event, it’s worth discussing a few other aspects of the Pixel 4.
First up is the price: The Pixel 4 starts at $799 and the Pixel 4 XL starts at $899. For those prices, you get 64 GB of storage, which is not enough. You will pay another $100 for 128 GB of storage, and there is no way to get more storage than that, either at purchase time or later, as there’s no microSD expansion.
Those prices are high, too high, in my opinion, given the quality of the previous three Pixel generations. But those prices also undercut the latest iPhone flagships significantly: A 64 GB iPhone 11 Pro will set you back $999, fully $200 more than Pixel 4. A 64 GB iPhone 11 Pro Max is also $200 more than the equivalent Pixel 4 XL, at $1099. (And no, the non-Pro iPhone 11, which starts at $699, is not comparable, despite its admittedly impressive internals. The camera system is less capable.)
There are three colors, Just Black and Clearly White, as before, and a new Oh So Orange, which is the version I preordered (with 128 GB of storage). The Clearly White and Oh So Orange variants are particularly stunning, but if you can’t get the color you want—the Oh So Orange model I preordered is already sold out, by the way—you can at least get a fun, colorful fabric case.
We need to talk about the bezels, in part because Google did not. The Pixel 4 has a large top forehead bezel and a smaller but still visible bottom chin bezel that remind me of Android flagships of two years ago, like the Pixel 2 XL or the Samsung Galaxy S8+. The larger top bezel contains a number of sensors, some of which, like Motion Sense, are important to the device’s new facial recognition capabilities (which Google also glossed over). So they’re perhaps necessary. I will say this: they’re preferable to the humongous notch that ruined the Pixel 3 XL last year.
But the most important feature to me, and to many Pixel fans, is the camera system.
From a photography standpoint, Pixel has thus far been notable for two reasons: These cameras take the very best photos in the smartphone market, always have, and they have done so, to date, using only a single rear camera at a time when their competition has moved to two or three lenses.
(Google did add a second lens to the front-facing selfie camera on the Pixel 3 XL only last year for ultra-wide, selfie stick-less shots. They never talked about this, but Pixel 4 is back to a single front-facing camera lens that offers a 90-degree field of view, compared to 97-degrees for ultra-wide in Pixel 3 XL.)
It took a few years, but Huawei, in particular, but also Apple and others have indeed started to catch up. So Google has finally taken a step into multiple rear camera lenses with the Pixel 4. But in a time in which most of its best competition utilizes three lenses, Google is using just two: a 12.2 MP main lens with larger pixels, an ƒ/1.7 aperture, and a 77-degree field of view, and a 16 MP telephoto lens with an ƒ/2.4 aperture and a 52-degree field of view.
What’s missing, of course, is an ultra-wide lens. Google said at its event that it felt that good telephoto was “more important” than ultra-wide, which is a cute way of acknowledging a missing feature that most definitely is coming in Pixel 5. I disagree: Telephoto and ultra-wide are both important, and I really like the ultra-wide lenses on the other recent flagships I’ve used this year.
That said, the telephoto capabilities appear to be impressive, despite the fact that the Pixel 4 is limited to “roughly” 2X optical zoom, shades of OnePlus’s contorted claims about the telephoto capabilities of the 7 Pro. Here, as always, Google’s AI smarts enable some computational photography magic that allows what is essentially hybrid zoom to deliver excellent results simply by pinching to zoom and taking snapshots. That looks solid, but I’ll need to test it.
Google got the most cheers for its so-called astrophotography capabilities, which not only let you take pictures of the stars at night but also the galaxies that are floating behind them. That stuff is absolutely impressive, but it’s not the type of thing you’ll use every day. And there are some more common photography improvements that are, to me, even more impressive and useful.
The first is a much-needed capability that I assume Google’s competitors will try to duplicate in their handsets next year: It’s called Live HDR+, and it uses AI to display exactly what an HDR+ shot will look like in the viewfinder. I know that sounds obvious. But when you are taking a difficult shot today—say, with bright light coming in through the window of an overly dark room—the viewfinder will either blow out the light or overly darken the shadows, depending on where you focus. Now, Pixel 4 will show you exactly what the resulting shot will look like so you can tweak it, if needed, at capture time. That will impact everyone in a very positive way.
Another huge improvement is Pixel 4’s dual exposure controls. Today, when I’m out in the world and want to take a shot on a lesser smartphone camera and it’s not contrasty enough, I’ll use the on-screen brightness control to help compensate. But doing so only solves part of the problem; it doesn’t help with pumping up the shadows. So Pixel 4 offers brightness and shadows sliders on-screen, letting you achieve the look you want. On stage, they did a nice job of showing how this could be used to create a silhouette of people against the sky.
Pixel 4 also brings the automatic white balance control from Night Sight to all other shooting modes, so that snow shots will no longer be blue-hued because of the reflection of the sky. And it uses split pixels to deliver better Portrait mode shots, which are well-known in other cameras (cough, iPhone) for their terrible edge detection, especially with hair. Portrait mode improves in Pixel 4 because of the two lenses, and it can be used with larger objects now, and from further back too.
What this all adds up to is what looks like the very best photography experience in smartphones. But that claim will need to be tested. And I’ll start doing that as soon as my Pixel 4 XL arrives, perhaps as soon as late next week. Given my previous Pixel experiences, I am, of course, a bit nervous. But I’m also hopeful that they got it right this time. Because the camera system capabilities look truly incredible, and a step above anything else we see in the market today.