Matthew Panzarino’s Hands-On with the iPhone 7 Plus’ Portrait Mode in iOS 10.1 Beta

Matthew Panzarino, writing for TechCrunch, was able to test the iPhone 7 Plus’ upcoming Portrait mode, released to developers earlier today with the first beta of iOS 10.1.

If you’ve skipped here to see how the heck it works, I don’t blame you. The short answer: incredibly, miraculously well in many instances. And pretty rough in others. Apple says this is still in beta, and it is. It has trouble with leaves, with chain-link fences and patterns, and with motion. But it also handles things I never thought possible, like fine children’s hair and dog fur, shots of people facing away, and subjects that are not people at all.

What does it have major trouble with? Fine lines, wires, chain link, glass, and leaves. Anything that repeatedly merges with the edges of your subject can confuse it, and the closer it sits to the subject, the harder it is to distinguish. Motion, too, is a no. If the subject moves a bit, that’s OK; if it moves too much, you get ghosting, as you do in HDR mode, because compositing is involved.

Let’s look at some examples, and I’ll dissect what works, what doesn’t, and how the mode applies the effect in each image. In each case, I’ll include both the standard and the Depth Effect image for comparison.

Panzarino reports that Portrait mode works on non-human subjects as well (something Apple didn’t originally mention), and that it uses new body detection systems and a “sliding scale” mechanism to apply blur to the background. It’s a fascinating explanation, with some good points on how Apple could improve Portrait mode in the future.
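
Apple hasn’t published how that sliding scale works internally, but the basic idea (more blur the farther a pixel sits from the in-focus subject) can be sketched with Core Image. The sketch below is purely illustrative and not Apple’s actual pipeline: it assumes a grayscale depth map where white means far from the camera, and it approximates the sliding scale by blending between the sharp frame and a fully blurred copy of it.

```swift
import CoreImage

// Illustrative sketch only; not Apple's Portrait pipeline.
// Assumes `depthMap` is a grayscale CIImage where white = far.
func depthEffect(image: CIImage,
                 depthMap: CIImage,
                 maxBlurRadius: Double = 12) -> CIImage? {
    // Blur the entire frame at the maximum radius.
    guard let blur = CIFilter(name: "CIGaussianBlur") else { return nil }
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(maxBlurRadius, forKey: kCIInputRadiusKey)
    guard let blurred = blur.outputImage?.cropped(to: image.extent) else { return nil }

    // Use the depth map as a blend mask: near pixels (dark) keep the sharp
    // frame, far pixels (bright) take the blurred frame, and values in
    // between mix the two, so the blur effectively "slides" with distance.
    guard let blend = CIFilter(name: "CIBlendWithMask") else { return nil }
    blend.setValue(blurred, forKey: kCIInputImageKey)          // shown where mask is white
    blend.setValue(image, forKey: kCIInputBackgroundImageKey)  // shown where mask is black
    blend.setValue(depthMap, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}
```

A single sharp/blurred blend is the cheapest approximation; a closer match to a true sliding scale would render several blur levels and choose between them per depth band, or use a custom kernel with a per-pixel radius.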