John Paczkowski of BuzzFeed conducted a fascinating interview with Phil Schiller, Apple’s Senior Vice President of Worldwide Marketing, and Johnnie Manzari of Apple’s Human Interface Team about the iPhone’s camera. Much of the discussion focuses on the new Portrait Lighting feature available on the iPhone 8 Plus and iPhone X. As Paczkowski explains,
The camera's effects don't rely on filters. They're the result of Apple's new dual camera system working in concert with machine learning to sense a scene, map it for depth, and then change lighting contours over the subject. It's all done in real time, and you can even preview the results thanks to the company’s enormously powerful new A11 Bionic chip. The result, when applied to Apple scale, has the power to be transformative for modern photography, with millions of amateur shots suddenly professionalized.
Manzari described the extensive process that went into creating Portrait Lighting:
"We spent a lot of time shining light on people and moving them around — a lot of time," Manzari says. "We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work."
BuzzFeed’s article is worth a close read because it’s about more than just the camera in Apple’s new and upcoming iPhones. The behind-the-scenes peek at the development process of the many functions that the iPhone’s camera serves is the best example of one of Apple’s biggest competitive advantages: the fusion of hardware and software.
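Apple hasn’t published how Portrait Lighting actually works, but the basic idea the article describes — use a per-pixel depth map to change the lighting falling on a subject — can be illustrated with a toy sketch. This is purely my own simplification (the function name, the linear gain model, and the `strength` parameter are all assumptions, not Apple’s pipeline):

```python
import numpy as np

def relight(image, depth, strength=0.4):
    """Brighten pixels nearest the camera (the subject) and dim the
    background -- a crude stand-in for depth-aware portrait lighting.

    image: float array of shape (H, W, 3), values in [0, 1]
    depth: float array of shape (H, W), 0 = far, 1 = near
    """
    # Gain rises above 1.0 for near pixels and falls below 1.0 for far ones.
    gain = 1.0 + strength * (depth - 0.5) * 2.0
    lit = image * gain[..., np.newaxis]
    return np.clip(lit, 0.0, 1.0)

# A tiny 2x2 test frame: left column "near" (the subject), right column "far".
img = np.full((2, 2, 3), 0.5)
dep = np.array([[1.0, 0.0],
                [1.0, 0.0]])
out = relight(img, dep)
```

The real feature is vastly more sophisticated — it models facial contours and applies studio-style lighting in real time — but the depth map as a mask over the image is the common thread.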
Today Apple released the first major update for its short-form video creation app, Clips. Version 1.1 includes, most notably, a variety of animated graphics featuring beloved characters from Disney and Pixar films.
In the Disney department, you can add Mickey, Minnie, Donald, and Daisy to your videos, each with their own unique animations. And from Pixar, characters originating in Toy Story, Inside Out, and Cars are available. There are also a variety of new posters available to be used as title cards, some designed by Disney and others by Apple. A selection of the new posters features vibrant animations when you use them, such as water rippling in a pool.
In addition to the new content available for creating videos, Apple has also refined some design aspects in the app to make it easier to use. For example, Live Titles could always be edited by tapping on the text, but that wasn’t a very discoverable interface. Now there’s a new button to accomplish the task.
Apple’s press release announcing the update mentions that Clips “is included on all new iOS devices,” which should help bolster adoption of the app. That press release also features a video seemingly created in Clips that’s worth checking out.
Today Microsoft introduced a new app exclusively for iPhone, Seeing AI. The app is designed as a tool for the low-vision community; using the iPhone’s camera and Microsoft’s AI smarts, Seeing AI converts the visual experience of the world into an audible one. As you point the camera at things around you, the app describes them in a quick, informative manner.
From a user’s perspective, the app is tremendously simple to use; there’s very little that needs to be done before Seeing AI can begin describing the space around you. If you want to identify people, you can first set them up as recognizable from the sidebar menu’s ‘Face Recognition’ option. Otherwise, all you have to do to start identifying things is select one of five categories (the app calls them ‘channels’) to help the app understand what type of object it needs to identify. The five current categories are:
Short Text
Document
Product
Person
Scene (currently tagged as ‘Beta’)
Microsoft says a category for currency will be coming soon, allowing the app to intelligently identify different denominations of cash.
In my testing of the app, it’s far from perfect in its ability to identify things, but it has done a solid job all around. Though the tech driving the app is still experimental and has a long way to go, the app is far from barebones in what it can do now. When identifying a document, Seeing AI will audibly guide you through the capture process to help you get the full document in view. After scanning a product’s barcode, in some cases you’ll receive additional information about the product beyond just its name. And if the app is scanning a person, it can even describe a best guess at their visible emotional state. It’s an impressive, deep experience that nevertheless remains dead simple to operate.
Even if you aren’t in the market for Seeing AI yourself, it’s a fascinating product worth checking out, and it’s entirely free. You can download it on the App Store.
Microsoft has a short introductory video, embedded below, that gives a great taste of all the app can do.
Apple describes Clips as an app "for making and sharing fun videos with text, effects, graphics, and more." Essentially it's a stripped-down version of a video editor like iMovie, optimized to make edits fast and user-friendly on mobile. Its key focus is allowing you to shoot seconds-long clips and string them together into a video worth sharing.
Apple released a new ad in its ‘practically magic’ series called ‘Take Mine,’ featuring the iPhone 7 Plus Camera app's Portrait mode. Set in Greece, the video starts with a young woman arriving to visit her grandmother. They sit in a cafe where the young woman takes a photo of her grandmother using Portrait mode, which simulates a shallow depth of field.
The grandmother gasps when she sees the photograph, exclaiming ‘What a great photo!’, which gets the attention of others nearby. That leads to the young woman being asked to take portraits of people all over her grandmother’s village. The action cuts between the woman taking photos and the portraits she takes, ending with the tagline ‘Portrait mode on the iPhone 7 Plus.’
Editing RAW files feels like a huge leap forward in terms of mobile photography: With iOS 10, the iPhone is evolving from a great camera for taking casual photos with into a capable professional tool. It still has plenty of limitations, but I suspect we've passed a tipping point.
But shooting while out and about is one thing. What about using the iPhone in a studio? I gathered together a couple of friends to do a little impromptu photoshoot to see how the iPhone would hold up.
Ben is the developer of Obscura, which I featured in my review yesterday because of its native RAW support on iOS 10. He makes some good points about the limitations and advantages of shooting RAW on the iPhone.
I consider the iPhone a computer with a camera more than a computer that makes phone calls. Therefore, Austin Mann's annual iPhone camera review is my favorite of the bunch. I've been linking to them for the past couple of years, and I always find Austin's approach fascinating and well presented.
As many of you know, in the past I've created this review in Iceland twice, Patagonia and Switzerland, but this year I wanted to really change things up. With indicators pointing toward possibilities of optical zoom I asked myself: where's the best place in the world to test optical zoom? Africa of course.
So this year, in collaboration with Nat Geo Travel + Nat Geo Adventure we’ve set out to get you the answers. I'm writing you from deep in the Nyungwe rain forest in southwest Rwanda… we've been tracking gorillas in the north, boating Lake Kivu in the west and running through tea plantations in the south… all with the iPhone 7 Plus in hand.
I've had a blast playing with the wide spectrum of new features and tech but as always, our mission is to find out the answer to one question:
How does all this new tech make my pictures (and videos) better than before?
The result is beautiful. The video "review" is a 9-minute mini-documentary/short film about tracking down mountain gorillas. Seeing how they found the gorillas brought a big smile to my face, and you can see how useful the zoom interface of the iPhone 7 Plus was for that purpose.
Watch it below, and go check out Austin's photos and summary of the experience here.
Fun new camera app for iPhone by William Wilkinson and Deepak Mantena: Slide lets you create animated GIFs with a 3D effect that you can share with others. The 3D effect, as delightfully explained in the app's weird promo video, is achieved by sliding the iPhone quickly to the side after selecting a subject.
I bought the app, and it works as advertised. Slide doesn't work with the front-facing camera because it requires a high frame rate to build the 3D animation, which, depending on what you're looking at, may produce odd and funny results. I'm sending pictures of 3D furniture to my parents right now and they're asking what's going on.
In the past eight years, each new advancement in iPhone camera technology has made dramatic improvements to image quality. The new 12-megapixel iPhone 6s iSight camera is no exception. With 50% more megapixels than the last four iPhone 8-megapixel models, the iPhone 6s boasts a number of key improvements including: improved auto-focus, local tone-mapping, noise reduction, and colour separation, with that fancy “deep trench isolation” technology Apple is raving about.
In this follow-up post to my previous iPhone comparisons, I present a 9 iPhone comparison from all iPhone versions taken with Camera+ including: the original iPhone, iPhone 3G, iPhone 3Gs, iPhone 4, iPhone 4s, iPhone 5, iPhone 5s, iPhone 6, and the new iPhone 6s, in a variety of real-life situations to test each iPhone camera’s capabilities.