In addition to the early slate of iPhone reviews from the press, it's become tradition in recent years for each iPhone to be graded as a camera by professional photographer Austin Mann. I especially enjoyed Mann's review this year of the iPhone XS camera system. He writes:
Most of the time my expectations for camera upgrades on “S” years aren’t so high, but after shooting with the iPhone XS for a week, I can confidently say it’s a huge camera upgrade. There’s a lot of little improvements, but Smart HDR definitely takes the cake. This is a feature and technology that improves virtually everything you capture with your iPhone camera. I think you’ll be really thrilled when you experience the results yourself.
As I shared in last week's issue of MacStories Weekly for Club MacStories, the iPhone XS and XR announcements caught me by surprise in that I expected more substantial changes from last year's iPhone X. I've ordered an XS Max, but the primary reason for my upgrade was the additional screen real estate compared to my X; bigger display aside, September's keynote didn't provide much of a compelling reason for me to purchase a new phone this year. However, Mann's review and that of John Gruber have helped provide much-needed additional detail on the camera upgrades in the XS, which sound impressively significant.
One of the standout lines in Mann's review for me comes near the beginning, where he says, "iPhone XS captures what your eyes see." It's hard to find higher praise than that.
Apple describes the XS as sporting “dual 12MP wide-angle and telephoto cameras”. This will be obvious to most of you, but in case it’s not, they’re not just dual rear-facing lenses, they’re dual rear-facing cameras. The wide-angle and telephoto lenses each have their own sensors. As a user you don’t have to know this, and should never notice it. The iPhone XS telephoto camera is the same as in the iPhone X — same lens, same sensor.
But the iPhone XS wide-angle camera has a new lens, which I believe to be superior to last year’s, and an amazing new sensor which is remarkably better than last year’s. And last year’s was very good.
Anytime an iPhone review gets too technical about camera details and photography lexicon, I tend to gloss over it and move on. I'm not a camera expert and I usually don't care about the nitty-gritty. But John Gruber's analysis of the iPhone XS' camera stack, A12 SoC, and seemingly unadvertised improved sensor is one of the most interesting camera-focused iPhone reviews I've read in years. I don't want to spoil it – move past the photos at the beginning and keep reading.
Speaking of smaller features I wouldn't have expected to see at last week's WWDC, Bryan Gaz, writing for Digital Photography Review, has noticed some welcome improvements to camera import and RAW files in iOS 12:
Now, when you plug in Apple’s SD card to Lightning adapter (or camera connection kit), the Photos app will show up as an overlay on whatever app you’re using. This comes as a much less invasive method than previously used in iOS 11, wherein whatever app you were in would be switched over to the full-screen Photos app for importing. It also means you can multitask more efficiently, importing photos while getting other stuff done.
Now, when photos are detected on a card, iOS 12 will automatically sort through the content and determine if any of the photos have already been imported. If they have, they will be put in a separate area so you don’t accidentally import duplicates. Another new feature is a counter on the top of the screen that lets you know how many photos are being displayed and how much space they take up on the memory card. This should help alleviate the guesswork involved when trying to determine whether or not you have enough storage on your iOS device.
I've never imported photos on my iPad using the Lightning to SD Card Camera Reader because I don't have a camera, but I know that the import process is one of the pain points for photographers who want to use an iPad in their workflows. The idea of having Photos show up automatically in Slide Over upon connecting an external device is interesting; it perfectly ties into the iPad's focus on drag and drop for multitasking and file transfers. It seems like this approach would work nicely for importing files from external USB devices if only Apple decided to add support for those too.
Update: After looking into this more closely, it appears that Photos only appears automatically upon connecting an SD card if it's already in Slide Over mode. This isn't as convenient as DP Review's original report, but at least all the other improvements mentioned in the story are indeed part of iOS 12.
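The duplicate screening DPReview describes is conceptually straightforward. A minimal sketch of the idea (hypothetical, not Apple's actual implementation) might hash each file on the card and partition new items from likely duplicates while tallying the count and size shown in the new counter:

```python
import hashlib

def partition_imports(card_files, imported_hashes):
    """Split files found on a memory card into new items and likely duplicates.

    card_files: list of (filename, bytes) tuples read from the card.
    imported_hashes: set of content hashes for photos already in the library.
    Returns (new_items, duplicates, total_bytes), mirroring iOS 12's
    separate duplicates area and the count/size readout.
    """
    new_items, duplicates = [], []
    total_bytes = 0
    for name, data in card_files:
        total_bytes += len(data)
        digest = hashlib.sha256(data).hexdigest()
        if digest in imported_hashes:
            duplicates.append(name)  # already imported; set aside
        else:
            new_items.append(name)
    return new_items, duplicates, total_bytes
```

Apple presumably matches photos by metadata rather than full-content hashing, but the partitioning logic is the same.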
Lovely update to Halide, my favorite third-party iPhone camera app, released today on the App Store. Among other improvements (such as an Apple Watch app and self-timer), I'm a fan of the new grid-based photo reviewer (try swiping down slowly on the grid to dismiss the view) as well as the advanced setting to prevent attaching location metadata when sharing a photo to social networks. I wish more apps offered an explicit preference like Halide does.
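Stripping location data before sharing, as Halide's new setting does, essentially means dropping the GPS block from a photo's EXIF metadata before export. A rough, library-agnostic sketch of the idea (the tag names here are illustrative, not Halide's actual code):

```python
def strip_location(exif_tags):
    """Return a copy of an EXIF tag dictionary with location data removed.

    Drops any tag whose name starts with 'GPS' (e.g. GPSLatitude,
    GPSLongitude), leaving all other metadata intact.
    """
    return {tag: value for tag, value in exif_tags.items()
            if not tag.startswith("GPS")}
```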
The focus on Accessibility in this release is also commendable:
We care deeply about Accessibility and have improved Halide with every update to make it easier to use for all users, but this update is our biggest push yet. With support for Dynamic and Bold Type throughout, VoiceOver support and many more enhancements. Even our 30 second timer option was included with Accessibility in mind, offering users with limited mobility more freedom to take photos.
That being said, we’re not done: this year we’ve worked with noted accessibility specialist Sommer Panage. She advised us on this release, and helped set goals for accessibility in the year ahead.
Matthias Hochgatterer, in a blog post detailing the invoice scanning feature he brought to Finances for iOS with an update released today:
I’ve just recently worked on invoice scanning for Finances. It lets you scan invoices on iPhone or iPad and add them as a PDF document to transactions. In this post I will show you how I’ve implemented that feature using the frameworks available on iOS.
Let's start by looking at the final result. You can see the invoice scanning in the Finances trailer. The user interface looks very similar to the document scanning UI in Apple’s Notes app on iOS 11. That’s not a coincidence. I’ve reimplemented the exact same user interface, because most iOS users are already familiar with it. Also, I found it an interesting challenge to implement it myself.
I've been considering Finances (which is available both on Mac and iOS and is on sale for both platforms today) as a replacement for the system I built in Numbers last year, which isn't scaling anymore (my accountant now wants me to upload PDF receipts to a Trello board, and traditional spreadsheets do not support inline file attachments). I'm intrigued by the cross-platform nature of Finances, its double-entry bookkeeping system, and this new Notes-like scanning mode built using Vision technologies in iOS 11. I haven't seen other apps publicly advertise scanning functionalities built using Vision and the implementation in Finances looks extremely well done.
I will be playing around with Finances over the weekend (I know: this isn't usually what I do with my weekends, but I also need to keep my accountant happy). You can take a look at Finances' new trailer below.
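Double-entry bookkeeping, which Finances is built around, records every transaction as balancing debits and credits across accounts, so the ledger as a whole always sums to zero. A minimal sketch of the principle (not Finances' actual data model):

```python
def post(ledger, entries):
    """Post a transaction to a ledger of account balances.

    entries is a list of (account, amount) pairs; positive amounts are
    debits, negative are credits. A valid double-entry transaction must
    sum to zero, otherwise it is rejected.
    """
    if sum(amount for _, amount in entries) != 0:
        raise ValueError("unbalanced transaction")
    for account, amount in entries:
        ledger[account] = ledger.get(account, 0) + amount
    return ledger
```

Recording a 50-euro office expense paid from a bank account would then be `post(ledger, [("Expenses:Office", 50), ("Assets:Bank", -50)])`, and the invariant `sum(ledger.values()) == 0` holds after every posting.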
We first reviewed Halide, the powerful third-party camera app by Ben Sandofsky and Sebastiaan de With, when it debuted in the summer of 2017 as an elegant alternative to Apple's Camera app that fully embraced RAW photography and advanced controls in an intuitive interface. We later showcased Halide's iPhone X update as one of the most thoughtful approaches to adapting for the device's Super Retina Display; to this day, Halide is a shining example of how the iPhone X's novel form factor can aid, rather than hinder, complex app UIs.
While Halide was already regarded as an appealing alternative to Apple's stock app for professional photographers and RAW-curious iPhone users (something that designer de With covered in depth in his excellent guide), it was lacking a handful of key features of the modern iPhone photography experience. Sandofsky and de With want to eliminate some of these important gaps with today's 1.7 update, which focuses on bringing the power of Portrait mode to Halide, supporting the iPhone X's TrueDepth camera system, and extending the app's integrations via a special ARKit mode, new export options, and native integration with the popular Darkroom photo editing tool.
Great analysis by Sebastiaan de With on how they redesigned Halide for the iPhone X (the app indeed turned out to be one of the best iPhone X app updates we've seen so far):
Design for ergonomics. On regular iPhones, you have to do much less as a designer to optimize ergonomics. The iPhone X requires you to think about comfortable button placement and usability. Ergonomics is more than just tapping, but also swiping and other gestures. Lay out your UI so all actions are accessible and as comfortably usable as possible.
It’s a whole new device: Design for it. Everyone can stretch an app out to a larger screen, but just like the iPad, a fresh approach is not only welcomed but helps you stand out in the App Store. This is a great time to validate your current design. Are your approaches still valid? Is there a better solution possible? You might come to some valuable insights that you can apply to all your designs, not just the ones for the shiny new device.
If you're a developer working on iPhone X UI updates, don't miss Sebastiaan's map visualization of the device's display.
Dan Provost of Studio Neat (makers of the excellent Glif) ran some tests to analyze the low-light performance of the iPhone X's telephoto lens:
Last year, when the iPhone 7 Plus was released, Glenn Fleishman wrote a terrific piece for Macworld about how the dual lens camera system works. In short, when you zoom in to 2X, the camera does not always switch to the telephoto lens. In some cases (typically in low light scenarios), you will be presented with a cropped image from the wide angle lens instead. This was sacrilege to camera nerds, but Apple would argue that if the cropped image looks better in those low light situations, then that is the correct approach.
Results are impressive:
As you can see, the iPhone X required very little light before it decided to use the telephoto lens. The iPhone 7 Plus required quite a bit more. I used the app Light Meter to measure the light at each interval, which I denote in the video. The app measures the lux, which is a measure of illuminance equal to one lumen per square meter. (I measured from both devices and averaged the results, as the readings were slightly different. I wouldn’t expect an app to function as well as a true light meter, but this probably gets us in the ball park).
Make sure to check out the video to see the lens switching in action. The difference between the iPhone 7 Plus and the X is substantial when it comes to the amount of light required for the system to pick the telephoto lens.
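Provost's figures follow directly from the definition he gives: illuminance in lux is luminous flux in lumens divided by the area in square meters it falls on, and each of his readings is the average of two devices. A quick sketch of both calculations:

```python
def illuminance_lux(lumens, area_m2):
    """Illuminance (lux) = luminous flux (lumens) / illuminated area (m^2)."""
    return lumens / area_m2

def averaged_reading(reading_a, reading_b):
    """Average two light-meter readings, as Provost did across his two devices."""
    return (reading_a + reading_b) / 2
```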
The iPhone's camera has long been one of its most important features. Every year when new models are introduced, it's a sure bet that camera improvements are part of the package. Last year that remained true, but it also proved an even more special year for the iPhone's camera setup. The introduction of dual rear-facing cameras with Portrait mode was something different – pictures no longer just looked a little better than on older iPhone models, they looked almost professional-quality.
This year, whether you picked up a new iPhone or not, Portrait mode is a better feature than before. Part of this is due to software improvements in iOS 11, but another key benefit is that third-party developers now have access to the depth information in Portrait photos. For the first time, Portrait images taken with the iPhone can be edited and enhanced in unique ways, and Focos is a new app that takes full advantage of that opportunity.
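The depth information apps like Focos tap into is, at its core, a per-pixel distance map; a synthetic refocus effect blurs each pixel in proportion to how far its depth lies from a chosen focal plane. A toy sketch of that mapping (illustrative only, not Focos' actual renderer):

```python
def blur_radius_map(depth_map, focus_depth, aperture=1.0):
    """Map per-pixel depths to blur radii for a synthetic-refocus effect.

    Pixels at focus_depth get radius 0 (sharp); the radius grows linearly
    with distance from the focal plane, scaled by a simulated aperture.
    depth_map is a 2D list of depth values (e.g. meters).
    """
    return [[aperture * abs(d - focus_depth) for d in row]
            for row in depth_map]
```

Changing `focus_depth` after the fact is what lets an app re-focus a Portrait photo, and widening `aperture` deepens the simulated bokeh.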