The Pixel 4 XL Photography Advantage

There’s been a lot of talk about how Apple and other smartphone makers have caught up with Pixel and Google’s awesome computational photography innovations. But while I was out testing the Pixel 4 XL camera system today, it occurred to me that there is a big difference between Google’s approach to photography and how its competitors implement similar functionality. And this difference is, I think, still key.

And the interesting thing is that Google sort of communicated this during its Made by Google ’19 hardware event last week. During the only truly interesting part of that presentation, Google Research’s Marc Levoy was highlighting the awesome zoom capabilities of the Pixel 4 XL when he specifically called out why zooming before you take the shot beats cropping afterward.

“By the way, super-res zoom is real multi-frame super-resolution, meaning that pinch-zooming before you take the shot gives you a sharper photo than cropping afterward,” he said, after showing off a zoomed-in shot of San Francisco’s Golden Gate Bridge that elicited gasps and applause from an audience that had been awfully quiet until that point. “Don’t crop. Compose the shot you want by pinch-zooming.”

(And if you haven’t, I strongly recommend checking out Levoy’s part of the presentation, which starts about 47 minutes into the video recording of the event.)
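
To make the “real multi-frame super-resolution” claim a bit more concrete, here is a deliberately simplified sketch of the general idea in Python. This is not Google’s actual pipeline, which works on raw sensor data and also has to estimate alignment and reject motion; the point is simply that several slightly shifted low-resolution frames, placed onto a finer grid, carry more detail than any single frame cropped and upscaled.

```python
import numpy as np

def merge_superres(frames, offsets, scale=2):
    """Naive multi-frame super-resolution sketch.

    frames  : list of HxW grayscale frames (float arrays)
    offsets : list of (dy, dx) subpixel shifts for each frame;
              assumed known here, whereas a real pipeline estimates
              them by aligning the burst
    scale   : upsampling factor of the output grid
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)

    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, offsets):
        # Map each low-res sample to its position on the fine output grid.
        fy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        fx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (fy, fx), frame)
        np.add.at(weight, (fy, fx), 1.0)

    weight[weight == 0] = 1.0  # avoid divide-by-zero in cells no frame reached
    return acc / weight
```

Even in this toy version, the merged grid recovers detail that cropping and upscaling a single frame cannot, because the tiny shifts between frames sample the scene at slightly different positions.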

That moment really stood out for me when he said it live, as did a subsequent comment about an ultra-wide lens, which the Pixel 4 XL regrettably lacks. But now that I’m using the Pixel 4 XL, I noticed something that should have been obvious but wasn’t: Unlike virtually all modern smartphones I’ve reviewed recently, including the Apple iPhone 11 Pro Max, Samsung Galaxy Note 10+, and OnePlus 7T, the Pixel 4 XL does not include dedicated on-screen buttons in the camera app to switch between the lenses the handset offers.

Think about that for a second.

With these other phones, when you want to use optical zoom (typically 2X), there is a dedicated telephoto button you can use to ensure that you’re getting that optical zoom with no digital zoom trickery. Yes, you can zoom further from there by pinching, possibly losing clarity and pumping up the noise. But these other handsets go out of their way to help you avoid doing that.
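
A rough back-of-the-envelope calculation shows why those dedicated buttons exist: once you zoom past a lens’s native field of view, the camera is simply cropping the sensor, so the pixels available for the final image fall off with the square of the extra zoom. The numbers below are illustrative, not the Pixel 4 XL’s actual specs or processing.

```python
def effective_megapixels(sensor_mp, lens_zoom, requested_zoom):
    """Megapixels of sensor data actually used once you zoom past the lens.

    sensor_mp      : native resolution of the sensor behind the lens
    lens_zoom      : the lens's optical zoom factor (1x wide, 2x telephoto, ...)
    requested_zoom : the zoom level the user pinched to
    """
    crop = max(requested_zoom / lens_zoom, 1.0)  # extra digital zoom beyond the lens
    return sensor_mp / (crop ** 2)

# Illustrative 12 MP telephoto sensor:
for z in (2, 3, 4, 8):
    print(f"{z}x zoom -> {effective_megapixels(12, 2, z):.1f} MP of real data")
# 2x -> 12.0, 3x -> 5.3, 4x -> 3.0, 8x -> 0.8
```

That steep falloff is exactly what multi-frame techniques like super-res zoom try to claw back.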

Google takes a different approach. There are no onscreen buttons in the camera app that let you specify either of the two lenses the handset offers (telephoto and wide). You couldn’t tell the camera app to use only optical 2X zoom if you wanted to.

Instead, the Pixel 4 XL always uses computational photography. Google, the real innovator in this field, is literally staking its claim to photographic excellence on its ability to AI the hell out of any scene you’re photographing. So you can pinch to zoom, and heck, maybe you’ll end up at exactly 2X zoom. But even that shot will be processed to be as good as it can be.

What you do get are onscreen sliders for zoom, brightness, and, new to the Pixel 4 XL, shadows. So you can tweak some aspects of the shot at the time you take it, and that’s where your creativity comes into play. But what you can’t do is try to outthink the onboard AI when it comes to zoom. There, Google has your back; mucking around with that yourself could only screw up the shot.

As of now, I can’t claim that the Pixel 4 XL is absolutely better as a camera when compared to, say, the iPhone 11 Pro series in particular. That opinion will come in time, and we’ll see where it lands. But what I can say is that Google’s use of computational photography is almost certainly an advantage it retains over the competition. And that each year it keeps raising the bar.

But yes, the lack of an ultra-wide lens is a mistake, and it is something Google should have added this year. Maybe we’ll get one with the Pixel 5 series in 2020, and we’ll see where else the firm’s computational photography prowess has taken it.

Conversation (22 comments)

  • dsilverman

    23 October, 2019 - 2:04 pm

    Are there any Android camera apps that DO give you onscreen buttons for each lens?

    • Paul Thurrott

      Premium Member
      23 October, 2019 - 3:03 pm

      In reply to dsilverman: Yes, OnePlus and Samsung do this.

    • jgraebner

      Premium Member
      23 October, 2019 - 6:06 pm

      In reply to dsilverman: I have a hunch that it won't be long before there are 3rd-party camera apps for the Pixel 4 that offer manual control of the lenses for those that want it.

  • Andi

    23 October, 2019 - 2:19 pm

    Another difference of approach between Apple and Google: the Pixel's photos retain the maximum detail possible, at the risk of underexposing, and that leaves you ample room to play around in Lightroom. The iPhone, however, squeezes every single detail to give you the best possible result on the phone. This usually overexposes the shot and leaves very little detail to work with during editing.

    • lvthunder

      Premium Member
      23 October, 2019 - 4:16 pm

      In reply to Andi: So use Lightroom and capture a RAW if you want to play with it in Lightroom anyways.

  • lvthunder

    Premium Member
    23 October, 2019 - 2:21 pm

    "Google, the real innovator in this field"

    Wow, what a fanboy comment if I've ever seen one. I guess the people working on this at other companies are just pretenders.

    Personally, I like having the button to switch easily. With photography, unless you're photographing a landscape, it's all about capturing the moment that ends in a flash. It's a whole lot quicker to press a button and get to 2x than it is to pinch to get there. I hate it when it takes people 10 seconds to take one picture. By then the moment most likely is gone, or the group you are shooting is bored and isn't showing their best expressions.

    • jwpear

      Premium Member
      23 October, 2019 - 2:36 pm

      In reply to lvthunder: It seems more like an opinion than a fanboy comment. I suspect it's more nuanced: Google will be better at some scenes and other OEMs will be better at others. Is Google better more often? Maybe. Does it really matter if you already have a device you like? Probably not.

      I think the main point Paul is trying to make is that the user doesn't have to think about which lens they need to use to take a shot. They just frame it and then Google figures out how to best capture that shot. It's hard to argue that that isn't the right approach for most users. I can see that some will want manual control at times. That seems fine too. Wouldn't it be nice if we had both options?

    • the_real_entheos

      23 October, 2019 - 2:50 pm

      In reply to lvthunder: You must be that guy who buys primo seats behind the bench of the visiting team (or home team) and yells insults.

      • lvthunder

        Premium Member
        23 October, 2019 - 4:15 pm

        In reply to the_real_entheos: No. I'm the one who's annoyed at those guys and hopes they shut up.

    • Paul Thurrott

      Premium Member
      23 October, 2019 - 3:02 pm

      In reply to lvthunder: Sorry, who was innovating in computational photography on smartphones before Google?

      • lvthunder

        Premium Member
        23 October, 2019 - 4:13 pm

        In reply to paul-thurrott: That guy that Google hired. I would also say that Microsoft did a form of it with Photosynth. I would also say HDR is a form of it as well, and people have been doing that manually for years before smartphones came around.

        You also didn't say who was doing it first. You said they were the "real" innovator. That means that the other engineers at other companies that are doing it as well aren't "real innovators." In my mind that is offensive to those engineers and unneeded in this story.

        • jaredthegeek

          Premium Member
          24 October, 2019 - 6:24 pm

          In reply to lvthunder: You must be tremendous fun at parties. Offensive to other engineers because a product is better?

      • SvenJ

        23 October, 2019 - 5:20 pm

        In reply to paul-thurrott: Nokia? At the state of the art back then.

        • jaredthegeek

          Premium Member
          24 October, 2019 - 6:24 pm

          In reply to SvenJ: They have not innovated anything in years. It's irrelevant.

  • Daekar

    23 October, 2019 - 3:32 pm

    I don't really care how good their AI is, I want the option for control, even if it's just the option to stop at the zoom level that corresponds with the optical zoom only. It's disingenuous for them to imply that the distinction doesn't matter, at least if they're going to pretend to be pro quality. That's the kind of arrogant thing that I would've expected from Apple, not Google.

    Oh Google, you have spent the last few years growing to be the worst version of yourself, little bit by little bit. When you're making Apple look humble and sensitive to customer needs by comparison, something has gone wrong.

  • Chris_Kez

    Premium Member
    23 October, 2019 - 3:49 pm

    Thanks for sharing, Paul. I don't think I've seen anyone else lay this out quite the way you did.

  • SyncMe

    23 October, 2019 - 6:00 pm

    Apple has always given you the ability to pinch to zoom. Apple has also been doing computational photography for at least as long as Google. Go back to many of the Apple keynotes and watch the camera technology details. All of them mention the computational photography taking place inside the phone. Google just started doing this on the phone.

    • CaymanDreamin

      Premium Member
      24 October, 2019 - 9:57 am

      In reply to SyncMe: You need to read the article again. He didn't say Google is the only company doing computational photography, he's saying they do it best. He didn't say Google is the only one with pinch to zoom, he said you don't have an option of a button to use only the zoom lens. You pinch to zoom and a combination of the second lens and AI does the rest.

  • IanYates82

    Premium Member
    23 October, 2019 - 6:30 pm

    Nice description of the functionality.

    One thing, though: the Galaxy S10, whilst offering buttons to choose between the lens modes, does also switch to the telephoto lens once you pinch/zoom past 2x. It kinda snaps at that point whilst switching lenses and then proceeds onwards with the zoom level.

  • wright_is

    Premium Member
    24 October, 2019 - 4:02 am

    Interesting. Now that you say that, I see that there is a zoom button on my Mate 10 Pro, but I've never used it; I've always used pinch to zoom. Likewise, I'd never think about not framing the image and cropping afterwards. Cropping is only used if the sensor/lens can't get in tight enough to make the shot.

    That is probably because I come from traditional film photography, where you either swap out the lens to get the telephoto you need or you have a smooth-zoom lens that doesn't "click" to specific zoom levels. It has an analogue twist to zoom, and you just look through the viewfinder until you have the image you want framed.

  • codymesh

    24 October, 2019 - 5:22 am

    It's unfortunate that the best camera on a phone is paired with virtually the worst battery life of any flagship.

  • eric_rasmussen

    Premium Member
    24 October, 2019 - 11:33 am

    The Pixel 3a I have takes amazing shots for a single sensor. It's nearly as good as the current generation of multi-sensor devices. I would expect nothing less than amazing photo abilities from Google in the Pixel 4; the team they have working on photography is easily the best in the industry.

    It's interesting to me that they specifically call it multi-frame. You can take a series of photographs in rapid sequence and use that to compose a highly-detailed composite image, but I didn't think anyone was doing that outside of the DSLR space and special silicon. It makes me wonder if they were already doing that in the Pixel 3, since the photos are so good.
