Posts tagged with "camera"

Halide 1.8

Lovely update to Halide, my favorite third-party iPhone camera app, released today on the App Store. Among other improvements (such as an Apple Watch app and self-timer), I'm a fan of the new grid-based photo reviewer (try swiping down slowly on the grid to dismiss the view) as well as the advanced setting to prevent attaching location metadata when sharing a photo to social networks. I wish more apps offered an explicit preference like Halide does.
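As an aside for developers: the underlying technique is simple enough with ImageIO. Here's a minimal sketch of the general idea (my own illustration with a hypothetical helper name, not Halide's actual code, which isn't public): re-export the image data while nulling out the EXIF GPS dictionary.

```swift
import Foundation
import ImageIO

// Hypothetical helper (not Halide's code): re-export image data with the
// GPS dictionary removed, so the shared copy carries no location metadata.
func strippingLocationMetadata(from imageData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let type = CGImageSourceGetType(source) else { return nil }

    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output as CFMutableData, type, 1, nil) else {
        return nil
    }

    // Passing kCFNull for the GPS dictionary tells ImageIO to drop it entirely.
    let options = [kCGImagePropertyGPSDictionary: kCFNull as Any] as CFDictionary
    CGImageDestinationAddImageFromSource(destination, source, 0, options)

    return CGImageDestinationFinalize(destination) ? (output as Data) : nil
}
```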

The focus on Accessibility in this release is also commendable:

We care deeply about Accessibility and have improved Halide with every update to make it easier to use for all users, but this update is our biggest push yet, with support for Dynamic and Bold Type throughout, VoiceOver support, and many more enhancements. Even our 30-second timer option was included with Accessibility in mind, offering users with limited mobility more freedom to take photos.

That being said, we're not done: this year we've worked with noted accessibility specialist Sommer Panage. She advised us on this release and helped set goals for accessibility in the year ahead.

Permalink

Finances for iOS Adds Invoice Scanning Powered by iOS 11’s Vision Framework

Matthias Hochgatterer, in a blog post detailing the invoice scanning feature he brought to Finances for iOS with an update released today:

I’ve just recently worked on invoice scanning for Finances. It lets you scan invoices on iPhone or iPad and add them as a PDF document to transactions. In this post I will show you how I’ve implemented that feature using the frameworks available on iOS.

Let's start by looking at the final result. You can see the invoice scanning in the Finances trailer. The user interface looks very similar to the document scanning UI in Apple's Notes app on iOS 11. That's not a coincidence. I've reimplemented the exact same user interface, because most iOS users are already familiar with it. Also, I found it an interesting challenge to implement it myself.

I've been considering Finances (which is available on both Mac and iOS, and is on sale for both platforms today) as a replacement for the system I built in Numbers last year, which isn't scaling anymore (my accountant now wants me to upload PDF receipts to a Trello board, and traditional spreadsheets don't support inline file attachments). I'm intrigued by the cross-platform nature of Finances, its double-entry bookkeeping system, and this new Notes-like scanning mode built with the Vision framework in iOS 11. I haven't seen other apps publicly advertise scanning features built on Vision, and the implementation in Finances looks extremely well done.
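For the curious, the detection step at the heart of a Notes-style scanner is a single Vision request on iOS 11. This is a minimal sketch of that step only (my own illustration, not Hochgatterer's code); a real implementation would follow it with perspective correction and PDF output.

```swift
import Foundation
import Vision

// A sketch of the detection step only: find the most likely document
// rectangle in a captured frame. (Not Hochgatterer's actual code.)
func detectDocument(in image: CGImage, completion: @escaping (VNRectangleObservation?) -> Void) {
    let request = VNDetectRectanglesRequest { request, _ in
        // The first result is the highest-confidence rectangle candidate.
        completion((request.results as? [VNRectangleObservation])?.first)
    }
    request.minimumAspectRatio = 0.3  // allow tall, narrow receipts
    request.minimumSize = 0.2         // ignore tiny rectangles in the frame
    request.maximumObservations = 1

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The returned VNRectangleObservation carries the four corner points needed to de-skew the page, for example with Core Image's CIPerspectiveCorrection filter.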

I'll be playing around with Finances over the weekend (I know, this isn't usually how I spend my weekends, but I also need to keep my accountant happy). You can take a look at Finances' new trailer below.

Permalink

Halide 1.7 Brings New Depth Photography and ARKit Features, Darkroom Integration

We first reviewed Halide, the powerful third-party camera app by Ben Sandofsky and Sebastiaan de With, when it debuted in the summer of 2017, offering an elegant alternative to Apple's Camera app that fully embraced RAW photography and advanced controls in an intuitive interface. We later showcased Halide's iPhone X update as one of the most thoughtful approaches to adapting for the device's Super Retina Display; to this day, Halide is a shining example of how the iPhone X's novel form factor can aid, rather than hinder, complex app UIs.

While Halide was already regarded as an appealing alternative to Apple's stock app for professional photographers and RAW-curious iPhone users (something that designer de With covered in depth in his excellent guide), it lacked a handful of key features of the modern iPhone photography experience. Sandofsky and de With aim to close some of those gaps with today's 1.7 update, which focuses on bringing the power of Portrait mode to Halide, supporting the iPhone X's TrueDepth camera system, and extending the app's integrations via a special ARKit mode, new export options, and native integration with the popular Darkroom photo editing tool.

Read more


How to Design for iPhone X (Without an iPhone X)

Great analysis by Sebastiaan de With on how they redesigned Halide for the iPhone X (the app indeed turned out to be one of the best iPhone X app updates we've seen so far):

Design for ergonomics. On regular iPhones, you have to do much less as a designer to optimize ergonomics. The iPhone X requires you to think about comfortable button placement and usability. Ergonomics is more than just tapping, but also swiping and other gestures. Lay out your UI so all actions are accessible and as comfortably usable as possible.

It’s a whole new device: Design for it. Everyone can stretch an app out to a larger screen, but just like the iPad, a fresh approach is not only welcomed but helps you stand out in the App Store. This is a great time to validate your current design. Are your approaches still valid? Is there a better solution possible? You might come to some valuable insights that you can apply to all your designs, not just the ones for the shiny new device.

If you're a developer working on iPhone X UI updates, don't miss Sebastiaan's map visualization of the device's display.
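In practice, a lot of this advice comes down to respecting the safe areas iOS 11 exposes to developers. A minimal sketch of the idea (my own illustration, not Halide's code), pinning a shutter control above the iPhone X's home indicator:

```swift
import UIKit

// My own illustration, not Halide's code: anchor the shutter button to the
// bottom safe area so it sits comfortably above the home indicator.
final class CameraViewController: UIViewController {
    private let shutterButton = UIButton(type: .custom)

    override func viewDidLoad() {
        super.viewDidLoad()
        shutterButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(shutterButton)

        NSLayoutConstraint.activate([
            shutterButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            // safeAreaLayoutGuide (new in iOS 11) accounts for the sensor
            // housing and home indicator on iPhone X.
            shutterButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -12),
            shutterButton.widthAnchor.constraint(equalToConstant: 72),
            shutterButton.heightAnchor.constraint(equalTo: shutterButton.widthAnchor)
        ])
    }
}
```

Because the constraint targets safeAreaLayoutGuide rather than the raw screen edge, the same layout falls back cleanly on devices without a home indicator, where the bottom safe area simply matches the view's edge.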

Permalink

Dual Lens Switching on iPhone X

Dan Provost of Studio Neat (makers of the excellent Glif) ran some tests to analyze the low-light performance of the iPhone X's telephoto lens:

Last year, when the iPhone 7 Plus was released, Glenn Fleishman wrote a terrific piece for Macworld about how the dual lens camera system works. In short, when you zoom in to 2X, the camera does not always switch to the telephoto lens. In some cases (typically in low light scenarios), you will be presented with a cropped image from the wide angle lens instead. This was sacrilege to camera nerds, but Apple would argue that if the cropped image looks better in those low light situations, then that is the correct approach.

Results are impressive:

As you can see, the iPhone X required very little light before it decided to use the telephoto lens. The iPhone 7 Plus required quite a bit more. I used the app Light Meter to measure the light at each interval, which I denote in the video. The app measures the lux, which is a measure of illuminance equal to one lumen per square meter. (I measured from both devices and averaged the results, as the readings were slightly different. I wouldn't expect an app to function as well as a true light meter, but this probably gets us in the ballpark.)

Make sure to check out the video to see the lens switching in action. The difference between the iPhone 7 Plus and the X is substantial when it comes to the amount of light required for the system to pick the telephoto lens.
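If you want to replicate the test yourself, you can tell which lens actually fired by reading a photo's EXIF focal length. A rough sketch; note that the threshold below is an approximation I'm assuming for the iPhone X's two lenses (roughly 4mm wide, 6mm telephoto), not an Apple-documented constant:

```swift
import Foundation
import ImageIO

// A rough check, not an Apple-documented contract: a "2X" shot that was
// actually cropped from the wide lens still reports the wide lens's focal length.
func capturedWithTelephoto(_ imageData: Data) -> Bool? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any],
          let exif = props[kCGImagePropertyExifDictionary as String] as? [String: Any],
          let focalLength = exif[kCGImagePropertyExifFocalLength as String] as? Double
    else { return nil }

    // Assumed approximate values: iPhone X wide ≈ 4mm, telephoto ≈ 6mm.
    return focalLength > 5.0
}
```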

Permalink

Focos: Powerful Depth Image Controls in a Fun Package

The iPhone's camera has long been one of its most important features. Every year when new models are introduced, it's a sure bet that camera improvements are part of the package. Last year that remained true, but it was also an especially significant year for the iPhone's camera setup: the introduction of dual rear-facing cameras with Portrait mode was something different. Pictures no longer just looked a little better than on older iPhone models; they looked almost professional-quality.

This year, whether you picked up a new iPhone or not, Portrait mode is a better feature than before. Part of this is due to software improvements in iOS 11, but another key benefit is that third-party developers now have access to the depth information in Portrait photos. For the first time, Portrait images taken with the iPhone can be edited and enhanced in unique ways, and Focos is a new app that takes full advantage of that opportunity.
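Concretely, that depth information ships inside the Portrait photo itself as auxiliary data, and iOS 11 lets any app extract it. A minimal sketch of the extraction step (Focos' actual pipeline isn't public):

```swift
import AVFoundation
import ImageIO

// Extract the disparity map embedded in a Portrait photo (iOS 11).
// This is the raw material apps like Focos build their effects on.
func portraitDepthData(at url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity
          ) as? [AnyHashable: Any]
    else { return nil }

    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}
```

From the AVDepthData, an app can convert disparity to depth and drive per-pixel effects like the adjustable blur Focos is built around.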

Read more


Can Clips Be a Modern Day Photo Booth?

Karan Varindani considers the potential of Apple's Clips to be a spiritual successor to Photo Booth:

With the iPad 2, back in early 2011, Apple brought Photo Booth to the iPad. I distinctly remember thinking that this was a no-brainer at the time. Growing up in Ghana, there weren’t that many Macs in my high school, but everybody that had one used Photo Booth. It was very regular to walk into the sixth form (senior year) common room and see groups of friends, myself included, behind a MacBook playing with the filters. Talking to several of my American friends, it sounds like it was the same deal here. I always thought that it was only a matter of time before Apple brought Photo Booth to the iPhone, but six years later it still just ships with Macs and iPads (and I don’t think that it’s been updated in that time).

Playing with the Selfie Scenes in Clips last week, I had the same feeling that I did playing with Photo Booth on my Mac many years ago. It was a little surreal, as someone with incredible front-camera shyness, to find myself having so much fun with it. The whole experience had me thinking: In a few years, once the Face ID technology has spread to the rest of the iOS line (and maybe even the Mac), could Clips be the successor to Photo Booth? Between Selfie Scenes, stickers, Live Titles, and fast sharing to social media, it seems the perfect fit.

I think the best modern equivalent of that Photo Booth social experience is Snapchat's lenses, which I've seen consistently deliver laughter and interest among a group of friends or family members. While Clips' Selfie Scenes offer a similarly neat technical effect, if Apple is serious about the app's success, a couple of big changes need to take place: the square orientation limit has to go, and Clips needs better hooks into apps like Instagram and Snapchat than the share sheet provides.

Photo Booth's prime was a very different era from today's, and without the aid of a true social network it will be hard for Apple to replicate its success. So far, Animoji seem much closer to meeting that goal than Clips.

Permalink


Apple’s Quest to Transform Photography

John Paczkowski of BuzzFeed conducted a fascinating interview with Apple’s Senior Vice President of Worldwide Marketing Phil Schiller and Johnnie Manzari of Apple's Human Interface Team about the iPhone’s camera. Much of the discussion is focused on the new Portrait Lighting feature available in the 8 Plus and X. As Paczkowski explains,

The camera's effects don't rely on filters. They're the result of Apple's new dual camera system working in concert with machine learning to sense a scene, map it for depth, and then change lighting contours over the subject. It's all done in real time, and you can even preview the results thanks to the company’s enormously powerful new A11 Bionic chip. The result, when applied to Apple scale, has the power to be transformative for modern photography, with millions of amateur shots suddenly professionalized.

Manzari described the extensive process that went into creating Portrait Lighting:

"We spent a lot of time shining light on people and moving them around — a lot of time," Manzari says. "We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work."

BuzzFeed’s article is worth a close read because it’s about more than just the camera in Apple’s new and upcoming iPhones. The behind-the-scenes peek at the development process of the many functions that the iPhone’s camera serves is the best example of one of Apple’s biggest competitive advantages: the fusion of hardware and software.

Permalink