Speaking of smaller features I wouldn't have expected to see at last week's WWDC, Bryan Gaz, writing for Digital Photography Review, has noticed some welcome improvements to camera import and RAW files in iOS 12:
Now, when you plug in Apple’s SD card to Lightning adapter (or camera connection kit), the Photos app will show up as an overlay on whatever app you’re using. This comes as a much less invasive method than previously used in iOS 11, wherein whatever app you were in would be switched over to the full-screen Photos app for importing. It also means you can multitask more efficiently, importing photos while getting other stuff done.
Now, when photos are detected on a card, iOS 12 will automatically sort through the content and determine if any of the photos have already been imported. If they have, they will be put in a separate area so you don’t accidentally import duplicates. Another new feature is a counter on the top of the screen that lets you know how many photos are being displayed and how much space they take up on the memory card. This should help alleviate the guesswork involved when trying to determine whether or not you have enough storage on your iOS device.
I've never imported photos on my iPad using the Lightning to SD Card Camera Reader because I don't have a camera, but I know that the import process is one of the pain points for photographers who want to use an iPad in their workflows. The idea of having Photos show up automatically in Slide Over upon connecting an external device is interesting; it perfectly ties into the iPad's focus on drag and drop for multitasking and file transfers. It seems like this approach would work nicely for importing files from external USB devices if only Apple decided to add support for those too.
Update: After looking into this more closely, it appears that Photos only appears automatically upon connecting an SD card if it's already in Slide Over mode. This isn't as convenient as DP Review's original report, but at least all the other improvements mentioned in the story are indeed part of iOS 12.
Lovely update to Halide, my favorite third-party iPhone camera app, released today on the App Store. Among other improvements (such as an Apple Watch app and self-timer), I'm a fan of the new grid-based photo reviewer (try swiping down slowly on the grid to dismiss the view) as well as the advanced setting to prevent attaching location metadata when sharing a photo to social networks. I wish more apps offered an explicit preference like Halide does.
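Halide's actual implementation isn't public, but the general technique for stripping location metadata before sharing is well established on iOS: copy the image through ImageIO while nulling out the GPS dictionary. A minimal sketch, with illustrative names, might look like this:

```swift
import Foundation
import ImageIO

// Hypothetical sketch of stripping GPS metadata from an image before
// sharing. This is not Halide's actual code; the function name and
// structure are assumptions.
func strippingLocationMetadata(from imageData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let type = CGImageSourceGetType(source) else { return nil }

    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output, type, 1, nil) else {
        return nil
    }

    // Setting the GPS dictionary to kCFNull removes it from the copy
    // while preserving the rest of the metadata.
    let options: [CFString: Any] = [kCGImagePropertyGPSDictionary: kCFNull as Any]
    CGImageDestinationAddImageFromSource(destination, source, 0, options as CFDictionary)

    return CGImageDestinationFinalize(destination) ? (output as Data) : nil
}
```

The appeal of this approach is that it rewrites only the metadata, not the pixel data, so the shared photo is otherwise byte-for-byte equivalent in quality.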
The focus on Accessibility in this release is also commendable:
We care deeply about Accessibility and have improved Halide with every update to make it easier to use for all users, but this update is our biggest push yet, with support for Dynamic and Bold Type throughout, VoiceOver support, and many more enhancements. Even our 30-second timer option was included with Accessibility in mind, offering users with limited mobility more freedom to take photos.
That being said, we’re not done: this year we’ve worked with noted accessibility specialist Sommer Panage. She advised us on this release, and helped set goals for accessibility in the year ahead.
Matthias Hochgatterer, in a blog post detailing the invoice scanning feature he brought to Finances for iOS with an update released today:
I’ve just recently worked on invoice scanning for Finances. It lets you scan invoices on iPhone or iPad and add them as a PDF document to transactions. In this post I will show you how I’ve implemented that feature using the frameworks available on iOS.
Let's start by looking at the final result. You can see the invoice scanning in the Finances trailer. The user interface looks very similar to the document scanning UI in Apple’s Notes app on iOS 11. That’s not a coincidence. I’ve reimplemented the exact same user interface, because most iOS users are already familiar with it. Also, I found it an interesting challenge to implement it myself.
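The document-detection step behind a Notes-style scanner typically starts with Vision's rectangle detection, available since iOS 11. This is a hedged sketch of that building block, not Finances' actual code; the function name and tuning parameters are assumptions:

```swift
import UIKit
import Vision

// Illustrative sketch: find the most likely document rectangle in a
// frame using Vision. Parameter values here are guesses, not
// Finances' real configuration.
func detectDocumentRectangle(in image: CGImage,
                             completion: @escaping (VNRectangleObservation?) -> Void) {
    let request = VNDetectRectanglesRequest { request, _ in
        // Pick the most confident rectangle candidate, if any.
        let best = (request.results as? [VNRectangleObservation])?
            .max(by: { $0.confidence < $1.confidence })
        completion(best)
    }
    // Loosen the defaults so tilted invoices on a desk are still found.
    request.minimumAspectRatio = 0.3
    request.maximumObservations = 5

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The returned observation exposes the rectangle's four corner points in normalized coordinates, which can then drive a perspective-correction step (for example, Core Image's CIPerspectiveCorrection filter) to produce the flattened page that ends up in the PDF.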
I've been considering Finances (which is available both on Mac and iOS and is on sale for both platforms today) as a replacement for the system I built in Numbers last year, which isn't scaling anymore (my accountant now wants me to upload PDF receipts to a Trello board, and traditional spreadsheets do not support inline file attachments). I'm intrigued by the cross-platform nature of Finances, its double-entry bookkeeping system, and this new Notes-like scanning mode built using Vision technologies in iOS 11. I haven't seen other apps publicly advertise scanning functionality built with Vision, and the implementation in Finances looks extremely well done.
I will be playing around with Finances over the weekend (I know; usually, this isn't what I do with my weekends but I also need to keep my accountant happy). You can take a look at Finances' new trailer below.
We first reviewed Halide, the powerful third-party camera app by Ben Sandofsky and Sebastiaan de With, when it debuted in the summer of 2017, providing an elegant alternative to Apple's Camera app that fully embraced RAW photography and advanced controls in an intuitive interface. We later showcased Halide's iPhone X update as one of the most thoughtful approaches to adapting for the device's Super Retina Display; to this day, Halide is a shining example of how the iPhone X's novel form factor can aid, instead of hindering, complex app UIs.
While Halide was already regarded as an appealing alternative to Apple's stock app for professional photographers and RAW-curious iPhone users (something that designer de With covered in depth in his excellent guide), it was lacking a handful of key features of the modern iPhone photography experience. Sandofsky and de With want to eliminate some of these important gaps with today's 1.7 update, which focuses on bringing the power of Portrait mode to Halide, supporting the iPhone X's TrueDepth camera system, and extending the app's integrations via a special ARKit mode, new export options, and native integration with the popular Darkroom photo editing tool.
Great analysis by Sebastiaan de With on how they redesigned Halide for the iPhone X (the app indeed turned out to be one of the best iPhone X app updates we've seen so far):
Design for ergonomics. On regular iPhones, you have to do much less as a designer to optimize ergonomics. The iPhone X requires you to think about comfortable button placement and usability. Ergonomics is more than just tapping, but also swiping and other gestures. Lay out your UI so all actions are accessible and as comfortably usable as possible.
It’s a whole new device: Design for it. Everyone can stretch an app out to a larger screen, but just like the iPad, a fresh approach is not only welcomed but helps you stand out in the App Store. This is a great time to validate your current design. Are your approaches still valid? Is there a better solution possible? You might come to some valuable insights that you can apply to all your designs, not just the ones for the shiny new device.
If you're a developer working on iPhone X UI updates, don't miss Sebastiaan's map visualization of the device's display.
Dan Provost of Studio Neat (makers of the excellent Glif) ran some tests to analyze the low-light performance of the iPhone X's telephoto lens:
Last year, when the iPhone 7 Plus was released, Glenn Fleishman wrote a terrific piece for Macworld about how the dual lens camera system works. In short, when you zoom in to 2x, the camera does not always switch to the telephoto lens. In some cases (typically in low light scenarios), you will be presented with a cropped image from the wide angle lens instead. This was sacrilege to camera nerds, but Apple would argue that if the cropped image looks better in those low light situations, then that is the correct approach.
Results are impressive:
As you can see, the iPhone X required very little light before it decided to use the telephoto lens. The iPhone 7 Plus required quite a bit more. I used the app Light Meter to measure the light at each interval, which I denote in the video. The app measures the lux, which is a measure of illuminance equal to one lumen per square meter. (I measured from both devices and averaged the results, as the readings were slightly different. I wouldn’t expect an app to function as well as a true light meter, but this probably gets us in the ball park).
Make sure to check out the video to see the lens switching in action. The difference between the iPhone 7 Plus and the X is substantial when it comes to the amount of light required for the system to pick the telephoto lens.
The iPhone's camera has long been one of its most important features. Every year when new models are introduced, it's a sure bet that camera improvements are part of the package. Last year that remained true, but it also proved an even more special year for the iPhone's camera setup. The introduction of dual rear-facing cameras with Portrait mode was something different – pictures no longer just looked a little better than on older iPhone models, they looked almost professional-quality.
This year, whether you picked up a new iPhone or not, Portrait mode is a better feature than before. Part of this is due to software improvements in iOS 11, but another key benefit is that third-party developers now have access to the depth information in Portrait photos. For the first time, Portrait images taken with the iPhone can be edited and enhanced in unique ways, and Focos is a new app that takes full advantage of that opportunity.
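The depth access that iOS 11 opened to third parties works by embedding a disparity map as auxiliary data inside the Portrait photo itself. Focos' actual pipeline isn't public, but a minimal sketch of reading that data back out looks roughly like this (the function name is an assumption):

```swift
import AVFoundation
import ImageIO

// Hedged sketch: read the disparity data embedded in a Portrait
// mode HEIC/JPEG, as exposed to third parties in iOS 11.
func depthData(fromPortraitImageAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }

    // AVDepthData reconstructs the per-pixel depth representation
    // from the auxiliary data dictionary.
    return try? AVDepthData(fromDictionaryRepresentation: info)
}
```

The resulting AVDepthData exposes a CVPixelBuffer depth map, which is what lets an editor like Focos re-render background blur, or shift the focal plane, long after the photo was taken.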
Karan Varindani considers the potential of Apple's Clips to be a spiritual successor to Photo Booth:
With the iPad 2, back in early 2011, Apple brought Photo Booth to the iPad. I distinctly remember thinking that this was a no-brainer at the time. Growing up in Ghana, there weren’t that many Macs in my high school, but everybody that had one used Photo Booth. It was very regular to walk into the sixth form (senior year) common room and see groups of friends, myself included, behind a MacBook playing with the filters. Talking to several of my American friends, it sounds like it was the same deal here. I always thought that it was only a matter of time before Apple brought Photo Booth to the iPhone, but six years later it still just ships with Macs and iPads (and I don’t think that it’s been updated in that time).
Playing with the Selfie Scenes in Clips last week, I had the same feeling that I did playing with Photo Booth on my Mac many years ago. It was a little surreal, as someone with incredible front-camera shyness, to find myself having so much fun with it. The whole experience had me thinking: In a few years, once the Face ID technology has spread to the rest of the iOS line (and maybe even the Mac), could Clips be the successor to Photo Booth? Between Selfie Scenes, stickers, Live Titles, and fast sharing to social media, it seems the perfect fit.
I think the best modern equivalent of that Photo Booth social experience is Snapchat's lenses, which I've observed can consistently deliver laughter and interest among a group of friends or family members. While Clips' Selfie Scenes offer a similarly neat technical effect, if Apple is serious about being successful with the app, a couple big changes need to take place: the square orientation limit has to go, and Clips needs better hooks into apps like Instagram and Snapchat than the share sheet provides.
Photo Booth's prime was a very different era than where we are today, and without the aid of a true social network it will be hard for Apple to replicate its success. So far, Animoji seem much closer to meeting that goal than Clips.
In a major update released today, Apple's Clips app for iOS received a redesigned user interface, Star Wars stickers and scenes, iCloud support for syncing clips between iPhone and iPad, and a big iPhone X feature exclusive: Selfie Scenes.