First Hands-On with iPhone 7 Plus Portrait Mode Disappoints

Thanks to the release this week of iOS 10.1 Beta 1, I was able to test an early version of Portrait mode, a coming feature of the iOS Camera app that will only work with the iPhone 7 Plus. And so far, it’s not all that impressive.

I say that as someone who first experienced the true power of incredible smartphone optics with the Lumia 1020, back in 2013. That summer, I took the Lumia 1020 to Amsterdam—and various spots in The Netherlands and Belgium—and was blown away by the quality of its photos, and especially so by the effortless bokeh effect it could produce.

The thing that was most impressive about the Lumia 1020 was that the bokeh was automatic: All you had to do was focus on an item and the camera would do the rest. It worked with far-off items, and it worked with tiny, close-up items, like this bee.

The Lumia 1020 did bokeh right…

Which you could, of course, zoom in on and see great detail.

… and it withstood closer examination.

The coming Portrait mode in the Camera app on the iPhone 7 Plus does not work like that. (Note: I’ve had OK success taking pictures of small things with previous iPhones, of course.) Instead, it works more like third-party photo apps, where you can select an area in a photo and have the software artificially blur out the surrounding area. That is, it’s not “real” bokeh. It’s simulated bokeh.

It’s also limited by the fact that it only uses one of the iPhone 7 Plus’s two cameras, the 12 MP “telephoto” camera with its ƒ/2.8 aperture, to take the photo. OK, yes, the other camera is actually used as well, but only to generate the depth data that is used to artificially blur non-focused items.
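To make “simulated bokeh” concrete, here is a minimal sketch of depth-masked blurring in Python. This is a hypothetical illustration of the general technique—blur the whole frame, then composite the in-focus pixels back on top using a depth map—not Apple’s actual pipeline; the function name and parameters are my own. Note that a hard depth threshold like the one below is exactly the kind of thing that produces fringing where the mask meets the edges of focused objects.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulated_bokeh(image, depth, focus_depth, tolerance=0.1, blur_sigma=8):
    """Fake a shallow depth of field using a depth map.

    image: HxWx3 float array; depth: HxW array normalized to [0, 1].
    focus_depth: depth value to keep sharp; everything else gets blurred.
    (Hypothetical sketch of the technique, not Apple's implementation.)
    """
    # Blur each color channel of the entire frame.
    blurred = np.stack(
        [gaussian_filter(image[..., c], sigma=blur_sigma) for c in range(3)],
        axis=-1,
    )
    # Binary mask: 1 where the pixel is near the focus plane, 0 elsewhere.
    # The hard cutoff here is what causes edge fringing in practice.
    in_focus = (np.abs(depth - focus_depth) < tolerance).astype(float)[..., None]
    # Composite: keep original pixels in focus, blurred pixels elsewhere.
    return in_focus * image + (1 - in_focus) * blurred
```

A real implementation would feather the mask and vary the blur radius with depth, but the basic shape—one sharp source, one blurred source, and a depth-derived mask deciding between them—is the same.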

This means a few things. It means that you have to be a certain distance from an item you wish to focus on. And that precludes taking photos like the bee shot above.

It means that it doesn’t work well—at all—in low light. My indoor shots have been universally grainy, like photos you’d have taken with an iPhone 3GS years ago. Look at this piece of garbage:

blur

But even outdoor shots on sunny days—ideal conditions for smartphone cameras—aren’t problem-free. Because the blur effect is software-based, you’ll often get fringing effects on the edges of focused items. Consider this shot.

cup

That looks fine until you zoom in on the top of the cup, which appears to be suffering from motion sickness.

cup-zoomed

This pattern repeats itself in all of the photos I took. Like with this plant:

plant

Which actually looks pretty terrible even before zooming, probably because there are too many things at different, but close, distances from the lens.

plant-zoomed

Yuck

Ultimately, what this means is that Apple’s software tricks are just that, tricks. And while the resulting photos may be fine for, say, Instagram, where no one will be zooming into the full-sized version of the photo anyway, they’re not good enough for actual memory preservation. At least not yet.

Obviously, given my interest in smartphone photography, this is a topic I’ll keep reexamining in future iOS 10.1 betas. But so far, I’m not impressed.

 

Thurrott © 2024 Thurrott LLC