Halide Developer Ben Sandofsky Breaks Down How the iPhone XR Captures Depth Data

Ben Sandofsky, from the team that makes the Halide iOS camera app, has published a detailed post on the iPhone XR’s camera and how Apple creates Portrait Mode photos with a single lens. Sandofsky walks through how Apple uses Focus Pixels to build a rough disparity map, which it combines with a Portrait Effects Matte to create Portrait Mode images.
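Both of those artifacts are exposed to third-party apps like Halide through AVFoundation: the disparity map arrives as AVDepthData, and the people mask as AVPortraitEffectsMatte. Here is a minimal sketch of requesting and reading both from a capture using Apple’s public APIs; the delegate class name is hypothetical, and error handling is omitted:

```swift
import AVFoundation

// Sketch of receiving depth data and the Portrait Effects Matte
// from a capture. `DepthCaptureDelegate` is a hypothetical name;
// the AVFoundation APIs are Apple's. Assumes the session and photo
// output were configured elsewhere with depth and matte delivery
// enabled (e.g. output.isDepthDataDeliveryEnabled = true).
final class DepthCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {

    func settings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
        let settings = AVCapturePhotoSettings()
        // Only request what the current device and output support.
        settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
        settings.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliverySupported
        return settings
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil else { return }

        // The rough disparity map: roughly 1/distance per pixel, at a
        // far lower resolution than the photo itself.
        if let depth = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32) {
            let map = depth.depthDataMap
            print("disparity: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
        }

        // The Portrait Effects Matte: a high-resolution mask that
        // separates people from the background.
        if let matte = photo.portraitEffectsMatte {
            let mask = matte.mattingImage
            print("matte: \(CVPixelBufferGetWidth(mask)) x \(CVPixelBufferGetHeight(mask))")
        }
    }
}
```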

The results have some advantages, but also distinct disadvantages compared to the iPhone XS’s camera. As Sandofsky explains:

It seems the iPhone XR has two advantages over the iPhone XS: it can capture wider angle depth photos, and because the wide-angle lens collects more light, the photos will come out better in low light and have less noise.

However:

…most of the time, the XS will probably produce a better result. The higher fidelity depth map, combined with a focal length that’s better suited for portraiture means people will just look better, even if the image is sometimes a bit darker. And it can apply Portrait effects on just about anything, not just people.

Although Apple’s Camera app can only take Portrait Mode photos of people on the iPhone XR, the upcoming Halide 1.11 update will combine the XR’s disparity map with Halide’s own blur effect to apply a similar effect to non-human subjects. Sandofsky admits the feature isn’t perfect because of the low quality of the disparity map the XR produces, but the photos included in his post show that it can take excellent pictures under some conditions.
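The general technique Halide describes, driving a variable blur with a depth mask, can be sketched with Core Image’s built-in CIMaskedVariableBlur filter. This is only an illustration of the concept, not Halide’s actual pipeline; the function name and the assumption that the disparity map is normalized to 0–1 are mine:

```swift
import CoreImage

/// Blur the background of `image` using its disparity map as a guide.
/// Illustrative sketch only, not Halide's implementation.
func portraitBlur(image: CIImage, disparityMap: CIImage, radius: Double = 12) -> CIImage {
    // Upscale the low-resolution disparity map to the photo's size.
    let sx = image.extent.width / disparityMap.extent.width
    let sy = image.extent.height / disparityMap.extent.height
    let mask = disparityMap.transformed(by: CGAffineTransform(scaleX: sx, y: sy))

    // Invert the map (assuming values normalized to 0...1) so that
    // far-away pixels, which have low disparity, receive the most blur.
    let inverted = mask.applyingFilter("CIColorInvert")

    // CIMaskedVariableBlur scales its blur radius by the mask value,
    // leaving the near, sharp subject intact.
    return image
        .applyingFilter("CIMaskedVariableBlur", parameters: [
            "inputMask": inverted,
            "inputRadius": radius
        ])
        .cropped(to: image.extent)
}
```

In practice the mask would also be smoothed and thresholded so the subject’s edges don’t pick up stray blur, which is where a low-quality disparity map like the XR’s shows its limits.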

It’s remarkable how much depth information can be squeezed out of the XR’s single lens, and instructive to understand how the underlying technology works. It’s also apparent that Apple has made significant advances since the introduction of the first dual-lens cameras.