Thanks to the release this week of iOS 10.1 Beta 1, I was able to test an early version of Portrait mode, a coming feature of the iOS Camera app that will only work with the iPhone 7 Plus. And so far, it’s not all that impressive.
I say that as someone who first experienced the true power of incredible smartphone optics with the Lumia 1020, back in 2013. That summer, I took the Lumia 1020 to Amsterdam—and various spots in The Netherlands and Belgium—and was blown away by the quality of its photos, and especially so by the effortless bokeh effect it could produce.
The thing that was most impressive about the Lumia 1020 was that the bokeh was automatic: All you had to do was focus on an item and the camera would do the rest. It worked with far-off items, and it worked with tiny, close-up items, like this bee.
Which you could, of course, zoom in on and see great detail.
The coming Portrait mode in the Camera app on the iPhone 7 Plus does not work like that. (Note: I’ve had OK success taking pictures of small things with previous iPhones, of course.) Instead, it works more like third-party photo apps, where you can select an area in a photo and have the software artificially blur out the surrounding area. That is, it’s not “real” bokeh. It’s simulated bokeh.
It’s also limited by the fact that it only uses one of the iPhone 7 Plus’s two cameras, the 12 MP “telephoto” camera with its ƒ/2.8 aperture, to take the photo. OK, yes, the other camera is actually used as well, but only to generate the depth data that is used to artificially blur non-focused items.
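Apple hasn't published how Portrait mode works internally, but the general idea it describes—keep pixels near the focus distance sharp and blend everything else toward a blurred copy, using a depth map from the second camera—can be sketched in a few lines. This is a minimal illustration of that technique, not Apple's algorithm; the function names and the depth tolerance are my own assumptions:

```python
import numpy as np

def box_blur(img, k=5):
    # Crude box blur (no SciPy needed): average each pixel's k x k neighborhood.
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def simulated_bokeh(img, depth, focus_depth, tolerance=0.1):
    # Pixels whose depth is within `tolerance` of focus_depth stay sharp;
    # everything else is replaced by the blurred version. (Hypothetical
    # parameters -- a real pipeline would vary blur strength with depth.)
    blurred = box_blur(img)
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, img, blurred)
```

The hard-edged mask here is also why fringing shows up in practice: any pixel the depth map misclassifies at an object's boundary gets the wrong treatment, sharp or blurred, and the seam becomes visible when you zoom in.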
This means a few things. It means that you have to be a certain distance from an item you wish to focus on. And that precludes taking photos like the bee shot above.
It means that it doesn’t work well—at all—in low light. My indoor shots have been universally grainy, like photos you’d have taken with an iPhone 3GS years ago. Look at this piece of garbage:
But even outdoor shots on sunny days—ideal conditions for smartphone cameras—aren’t problem-free. Because the blur effect is software-based, you’ll often get fringing effects on the edges of focused items. Consider this shot.
That looks fine until you zoom in on the top of the cup, which appears to be suffering from motion sickness.
This pattern repeats itself in all of the photos I took. Like with this plant:
Which actually looks pretty terrible before zooming, probably because there are too many things at different, but close, distances from the lens.
Ultimately, what this means is that Apple’s software tricks are just that, tricks. And while the resulting photos may be fine for, say, Instagram, where no one will be zooming into the full-sized version of the photo anyway, they’re not good enough for actual memory preservation. At least not yet.
Obviously, given my interest in smartphone photography, this is a topic I’ll keep reexamining in future iOS 10.1 betas. But so far, I’m not impressed.