Apple has released the first big update in over a year for its Clips video creation tool. Following the trend begun in iOS 12, which added Animoji support to FaceTime, all Animoji and Memoji characters can now also be used inside Clips. Though I would have expected such an update a year ago, it’s nevertheless good to see. Besides Animoji and Memoji, Clips 2.1 adds only a couple of other small new features, like a fresh batch of Mickey and Minnie stickers, a ‘Let It Snow’ winter poster, and support for right-to-left languages. After spending some time with the update, I found a couple of nice implementation details related to Animoji that deserve highlighting.
Sebastiaan de With, part of the team behind the camera app Halide, has published part 1 of a multi-part breakdown of the iPhone 11 camera. It’s a fantastic analysis of what makes the new camera different from past versions and goes into great depth while remaining accessible, even if you have only a passing familiarity with photography.
To put this year’s camera into perspective, de With recaps what Apple did with last year’s iPhone cameras, explaining how Smart HDR works and its shortcomings. The iPhone 11 features Smart HDR too, but as de With explains, Apple has significantly improved how it handles the dynamic range of an image.
Another aspect of the improvement is in the camera sensor hardware. Despite its diminutive size, the iPhone 11’s image sensor can resolve more detail than any iPhone camera before it.
However, many of the iPhone 11’s camera improvements come down to better software. The new camera post-processes each component of an image differently, applying different noise reduction to the sky, a face, hair, and clothing, for example. Apple calls the feature Photo Segmentation, and it’s aided by machine learning.
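The idea behind segmentation-aware processing can be illustrated with a deliberately crude sketch. Everything here is invented for illustration (the labels, the strength values, and the one-dimensional "smoothing" stand in for what is, in reality, machine-learning-driven processing on full images); it only shows the concept of applying different noise reduction per semantic region:

```python
# Toy sketch of segmentation-aware processing: a per-pixel label map
# selects how aggressively each region is smoothed. This is a
# hypothetical simplification, not Apple's actual pipeline.

def smooth(value, neighbors, strength):
    """Blend a pixel toward the mean of its neighbors by `strength` (0..1)."""
    mean = sum(neighbors) / len(neighbors)
    return value * (1 - strength) + mean * strength

# Smoothing strength per semantic class: a sky tolerates heavy noise
# reduction, while hair needs its fine detail preserved.
STRENGTH = {"sky": 0.9, "face": 0.4, "hair": 0.1}

def process_row(pixels, labels):
    out = []
    for i, (p, label) in enumerate(zip(pixels, labels)):
        neighbors = [pixels[j] for j in (i - 1, i + 1) if 0 <= j < len(pixels)]
        out.append(smooth(p, neighbors, STRENGTH[label]))
    return out

row = [100, 140, 120, 90, 95]
labels = ["sky", "sky", "hair", "face", "face"]
print(process_row(row, labels))
```

The point of the sketch is simply that the same denoising pass, applied uniformly, would either smear the hair or leave the sky noisy; a label map lets each region get its own treatment.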
One of my favorite features of the new camera is Night Mode. As de With notes:
In the iPhone 11 Night Mode, you can also see detail vanish in some areas. Except that it really seems to only affect parts of the image that you don’t really care that much about. Night Mode has a remarkable if not uncanny ability to extract an image that is sometimes even sharper than the regular mode, with strong sharpening and detail retention occurring in areas that are selected by the camera during processing.
The iPhone 11’s camera is also the first one de With thinks rivals standalone cameras:
In the past, iPhones made great photos for sharing on social media, but blown up on a big screen, the shots didn’t hold up. It’s why I frequently still pack a ‘big’ camera with me on trips.
With these huge improvements in processing, the iPhone 11 is the first iPhone that legitimately challenges a dedicated camera.
There are many more details in de With’s article, including a close look at the iPhone 11’s ultra wide lens. Every section of the post has photos and side-by-side comparisons that illustrate the analysis too, which makes the full post a must-read.
According to TechCrunch’s Matthew Panzarino, Apple will roll out Deep Fusion, the camera feature announced at the company’s fall iPhone event, today as part of the iOS developer beta program.
Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:
The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.
This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
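To make the blending idea concrete, here is a deliberately naive sketch. The frames, the `detail_weight` parameter, and the `blend` function are all invented for illustration; the real pipeline uses four neural networks and per-pixel registration, none of which is modeled here. The sketch only captures the broad intuition: average the EV0 frames to suppress noise, then pull some of the darker short frame's local detail back in:

```python
# Highly simplified sketch of a Deep-Fusion-style blend (invented
# weights, no neural networks): average the EV0 frames to reduce
# noise, then mix in detail from the sharper, underexposed frame.

def blend(short_frame, ev0_frames, detail_weight=0.3):
    n = len(ev0_frames)
    out = []
    for i, s in enumerate(short_frame):
        base = sum(f[i] for f in ev0_frames) / n   # noise-reduced base
        # Add a fraction of the short frame's deviation from the base,
        # standing in for "pulling sharpness" from the short exposure.
        out.append(base + detail_weight * (s - base))
    return out

short = [80, 200, 60]                              # underexposed but sharp
ev0s = [[100, 180, 90], [104, 176, 86], [96, 184, 94]]
print(blend(short, ev0s))
```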
Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”
There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.
I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.
Halide 1.14 is out with a new lens switching UI to accommodate the three-camera system of the iPhone 11 Pro and Pro Max. As soon as the update was out, I went for a walk to give it a try.
Halide has introduced a new lens switching button featuring haptic feedback and a dial-like system for moving among the iPhone’s lenses. When you press down on the lens button, you get a tap of haptic feedback to let you know without looking that the lens picker has been engaged.
From there, you can slide your finger among the ultra wide, wide, and telephoto options that radiate out from the button. As you swipe your finger across each option, it enlarges, and you’re met with another little bit of haptic feedback as you swipe over the lenses other than the one already selected. Once you have the lens you want, you simply let go and your iPhone switches to it.
You can also cycle through the lenses in order by tapping the button repeatedly, or jump straight to the ultra wide lens by swiping left and to the telephoto lens by swiping up. In my brief tests, swiping left or up is the best option if you already know the lens you want, but the dial-like lens switcher is perfect for considering your options first because Halide has also added lens preview guides.
With the lens button engaged, Halide shows guides for each of your zoom options. That means if you’re using the ultra-wide lens, you’ll see the light gray guidelines for the wide and telephoto lenses. As you swipe over those lenses, the guides change to yellow to highlight the composition you’ll get if you switch to that lens.
If you’re already using the telephoto lens, though, Halide will highlight the outer frame of the image to suggest you’ll get a wider shot, though it does not zoom the viewfinder out to show that composition until you lift your finger. You can see how the lens guides work from the screenshots I took at a local high school football field and in the accompanying video.
When you switch to the ultra wide lens, you’ll notice that not all the usual Halide features are available. Manual focus is missing and so is shooting in RAW. That’s because the new iPhone hardware and iOS and iPadOS 13 don’t support those features. Although the ultra wide shots don’t support RAW, Halide has included a ‘MAX’ option in place of the ‘RAW’ option, so you can get the most image data possible from your ultra wide shots, which you can see in the screenshots below.
The Halide team says that the latest update also includes noise-reduction adjustments to the RAW images produced by the iPhone 11, but that they are continuing to fine-tune how the app handles RAW photos from the new phones as part of a more significant update that is coming next.
The latest update is relatively small, but I especially like the use of haptic feedback and lens guides, which make it easy to switch lenses when you’re focused on the viewfinder of the camera instead of Halide’s buttons.
Every year I look forward to Austin Mann taking the latest iPhones through their paces somewhere in the world. This year, Mann is on tour with cellist Yo-Yo Ma in China where he went out into the countryside to capture some stunning portraits and landscapes.
Mann’s review covers the new Ultra Wide lens, Night Mode, Smart HDR improvements, and the ability to capture outside the frame, along with wishes for additional improvements. Mann’s take on Night Mode:
As long as I can remember, the top question I’ve received from iPhone photographers, beginners and pros alike, is, “How can I shoot better pictures in low light?” This year’s addition of Night mode is the answer to the question. It’s easy to use, crazy powerful, and because it’s automatic it will completely change how everyone shoots on their iPhone.
Mann confirms what seemed to be the case from the photos that Apple showed off last week at its event in Cupertino – Apple has implemented Night Mode in a way that doesn’t try to turn night into day:
One thing I love about Apple’s approach to Night mode is the strategic balance of solving a technical problem while also caring deeply about artistic expression. When you look at the image above, it’s clear their team didn’t take the let’s-make-night-look-like-day approach, as some of their competitors have. Instead, it feels more like an embrace of what it actually is (night) while asking, “How do we capture the feel of this scene in a beautiful way?”
How Apple accomplishes Night Mode is interesting. As Mann explains:
From what I understand, the way Night mode actually works is the camera captures a bunch of short exposures and slightly longer exposures, checks them for sharpness, throws out the bad ones and blends the good ones. On a traditional dSLR/mirrorless camera, a 5 second exposure is one single, continuous recording of the light throughout the duration of the shutter so any movement (of subject or camera) is recorded.
But with iPhone 11 Pro the rules are different… it’s not capturing one single continuous frame but blending a whole bunch of shots with variable lengths (some shorter exposures to freeze motion and longer shots to expose the shadows.) This means the subject can actually move during your exposure but still remain sharp.
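The capture-score-discard-blend loop Mann describes can be sketched in a few lines. Everything below is a toy (the sharpness metric, the `keep` count, and the one-dimensional "frames" are my own inventions, not Apple's algorithm); it only illustrates the idea of throwing out blurry exposures before averaging:

```python
# Simplified sketch of Night-mode-style frame selection: score each
# frame with a toy sharpness metric, keep the sharpest, average them.
# The metric and parameters are invented for illustration.

def sharpness(frame):
    """Toy metric: total absolute difference between adjacent pixels."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))

def night_blend(frames, keep=2):
    # Keep the `keep` sharpest frames, then average them pixel by pixel.
    best = sorted(frames, key=sharpness, reverse=True)[:keep]
    return [sum(px) / len(best) for px in zip(*best)]

frames = [
    [10, 80, 12, 75],   # sharp (high local contrast)
    [40, 45, 42, 44],   # motion-blurred (low contrast)
    [12, 78, 10, 77],   # sharp
]
print(night_blend(frames))
```

Averaging only the sharp frames is what lets a multi-second "exposure" stay crisp even when the subject moves between shots.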
If you’ve been wondering about the Ultra Wide camera on the new iPhones or the other new features of the camera app, be sure to check out Austin Mann’s full review for great technical and artistic insights about what Apple has accomplished with its new cameras, as well as some absolutely fantastic examples of what they can do.
Spectre is a new specialized camera app from the team that created Halide, one of our favorite camera apps on iOS. The Halide team describes Spectre as a computational shutter for the iPhone, which allows the app to do things like remove people from a crowded scene, create artistic images of rushing water, and produce light trails at night. The same sort of images can be created using traditional cameras, but getting the exposure right, holding the camera absolutely still, and accounting for other factors make them difficult to get right. With Spectre, artificial intelligence is used to simplify the process and make long-exposure photography accessible to anyone with an iPhone.
On launch day for the new iPad Pros, which feature the iPhone’s TrueDepth camera system for the first time, Apple has upgraded its Clips video app with new features that take advantage of TrueDepth’s power. Today’s update also brings new camera filters, posters, stickers, and soundtrack options.
If you have an iPhone X, XS, or XR, or one of the new iPad Pros, the highlight of this release is a batch of six new Selfie Scenes. Last year when the iPhone X launched, Clips debuted Selfie Scenes as a fun and impressive way to utilize the new device’s TrueDepth camera system. Selfie Scenes isolate you from your environment and replace your surroundings with interesting digital backgrounds, such as the Millennium Falcon from Star Wars, or an animated world. The new scenes added in today’s update are called Clouds, High Noon, Big Backyard, Monster Lab, Animal Forest, and Municiberg Mayhem, a scene from The Incredibles 2. They’re all a lot of fun, providing different moods through sound effects and music. Apple says that Selfie Scenes perform better than ever on recent devices thanks to the A12 Bionic’s Neural Engine, and in my tests I did notice that the scenes were smoother and more responsive than before.
Filters and soundtracks are Clips’ next most substantial upgrades. There are three new filters: Comic Mono, Watercolor Mono, and Aged Film. Of these, the last is easily my favorite, as the first two are only monochrome versions of existing filters. On the soundtrack side, there are a whopping 17 new tunes to choose from for your videos.
Every major Clips update adds a host of new posters, stickers, and text labels, and today’s is no exception. Continuing Apple’s partnership with Disney, there are poster options from Coco and The Incredibles 2, as well as designs related to sports, science, and more.
Though the app’s development cycle has slowed, Apple continues to plug away at making Clips a great tool for short video creation. My biggest wish for the app – non-square video – has still gone unfulfilled and continues to hold Clips back, but hopefully one day we’ll get that change. Until then, the Selfie Scenes are a fun demo of the newest iPhones and iPad Pros, and Clips remains the most enjoyable video creation tool I’ve ever used.
Today, following its Brooklyn keynote event, Apple released iOS 12.1, the first major update since September’s iOS 12 brought Shortcuts, Screen Time, and more. Version 12.1 adds over 70 new emoji, introduces Group FaceTime with up to 32 participants, and brings camera improvements and dual SIM support to 2018’s iPhones.
Ben Sandofsky from the team that makes the Halide iOS camera app has a detailed post on the iPhone XR’s camera and how Apple creates Portrait Mode photos with a single lens. Sandofsky walks through how Apple uses Focus Pixels to develop a rough Disparity Map and combines that with a Portrait Effects Matte to create Portrait Mode images.
The results have some advantages, but also distinct disadvantages compared to the iPhone XS’s camera. As Sandofsky explains:
It seems the iPhone XR has two advantages over the iPhone XS: it can capture wider angle depth photos, and because the wide-angle lens collects more light, the photos will come out better in low light and have less noise.
…most of the time, the XS will probably produce a better result. The higher fidelity depth map, combined with a focal length that’s better suited for portraiture means people will just look better, even if the image is sometimes a bit darker. And it can apply Portrait effects on just about anything, not just people.
Although Apple’s Camera app can only take Portrait Mode photos of people on the iPhone XR, the upcoming Halide 1.11 update will combine the XR’s Disparity Map and Halide’s own blur effect to apply a similar effect beyond human subjects. Sandofsky admits that the feature isn’t perfect due to the low quality of the Disparity Map created by the XR, but the photos included in his post show that it can take excellent pictures under some conditions.
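The general idea of depth-driven background blur can be sketched simply. To be clear, this is not Halide's or Apple's code: the threshold, the crude "blur," and the one-dimensional data are all hypothetical stand-ins meant only to show how a disparity map separates subject from background:

```python
# Toy sketch of a disparity-driven portrait effect: pixels with high
# disparity (near the camera) stay sharp; the rest are "blurred."
# The threshold and the naive blur are invented for illustration.

def portrait_blur(pixels, disparity, threshold=0.5):
    # Crude stand-in for a blur: the mean of all background pixels.
    background = [p for p, d in zip(pixels, disparity) if d < threshold]
    blurred = sum(background) / len(background) if background else 0
    return [p if d >= threshold else blurred
            for p, d in zip(pixels, disparity)]

pixels = [50, 200, 210, 60]
disparity = [0.1, 0.9, 0.8, 0.2]   # subject in the middle, background at edges
print(portrait_blur(pixels, disparity))
```

A low-quality disparity map, like the XR's, makes that subject/background split ragged at the edges, which is exactly the imperfection Sandofsky describes.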
It’s remarkable what is being done to squeeze depth information out of the XR’s single lens and instructive to understand how the underlying technology works. It’s also apparent that Apple has made significant advancements since the introduction of the first dual-lens cameras.