
Posts tagged with "photos"

Adobe Previews Direct Photo Import from External Storage Coming to Lightroom for iPad

In a video shared earlier today, Tom Hogarty, a Lightroom product manager at Adobe, demonstrated an upcoming feature of Lightroom for iPad – the ability to import photos from external devices (such as cameras, drives, or SD cards connected over USB-C) into Lightroom’s library without copying them to the Photos app first.


The workflow looks very nice: an alert comes up as soon as an external device is detected, photos are previewed in a custom UI within Lightroom (no more Photos overlay), and they’re copied directly into the app. I think anyone who uses Lightroom for iPad to edit photos taken with a DSLR is going to appreciate this addition. Keep in mind that the 2018 iPad Pros support up to 10 Gbps transfers over USB-C, which should help when importing hundreds of RAW files into Lightroom.
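As a rough back-of-envelope illustration of what that link speed means in practice (the ~40 MB-per-file RAW size and the 500-file count are assumptions I’m supplying for the example, not figures from Adobe or Apple):

```python
# Back-of-envelope import time over USB-C at the 2018 iPad Pro's
# theoretical 10 Gbps ceiling. The ~40 MB-per-file size and 500-file
# count are illustrative assumptions, not figures from Adobe or Apple;
# real-world throughput also never reaches the theoretical link speed.
LINK_GBPS = 10
BYTES_PER_SECOND = LINK_GBPS * 1e9 / 8      # 1.25 GB/s theoretical maximum

file_bytes = 40 * 1e6                       # assumed RAW file size: ~40 MB
num_files = 500

total_bytes = num_files * file_bytes        # 20 GB of photos
seconds = total_bytes / BYTES_PER_SECOND
print(f"~{seconds:.0f} s in theory")        # prints "~16 s in theory"
```

Even if real-world transfers land at a fraction of that ceiling, a large import should finish in minutes rather than the long waits older Lightning-based iPads required.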

Direct photo import from external USB storage devices was originally announced by Apple at WWDC 2019 as part of the “Image Capture API” for iPadOS. When I was working on my iOS and iPadOS 13 review, I searched for documentation to cover the feature, but I couldn’t find anything on Apple’s website (I wasn’t the only one). Eventually, I just assumed it was part of the functionalities Apple delayed until later in the iOS 13 cycle. It turns out that this feature was quietly introduced by Apple with iOS and iPadOS 13.2, as also suggested by Hogarty in the Lightroom video.

According to this thread on Stack Overflow, direct photo import is part of the ImageCaptureCore framework, which is now also available on iOS and iPadOS. I still can’t find any documentation for it on Apple’s developer website.


Apple’s Deep Fusion Camera Feature Launching as Part of the iOS Developer Beta Program

According to TechCrunch’s Matthew Panzarino, Apple will roll out Deep Fusion, the camera feature announced at the company’s fall iPhone event, today as part of the iOS developer beta program.

Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:

The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
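Purely as a toy numeric sketch of the blending idea Panzarino describes – this is not Apple’s pipeline, which runs neural networks over raw sensor data, and every name and number here is hypothetical – one way to picture it is taking overall brightness from the bracketed frames and fine detail from the sharp short frame:

```python
# Toy illustration of exposure fusion. This is NOT Deep Fusion itself:
# Apple blends at the pixel level with neural networks tuned to its
# sensors, while this sketch just mixes brightness with a crude detail
# term. All names and values are hypothetical.
def toy_fuse(short, normals, long_frame):
    """Each argument is a list of pixel luminance values (a 1-D 'image').
    `short` is the darker, sharper frame; `normals` are the EV0 frames;
    `long_frame` is the brighter EV+ frame."""
    n = len(short)
    # Blended brightness from the normal and long exposures:
    base = [(sum(f[i] for f in normals) + long_frame[i]) / (len(normals) + 1)
            for i in range(n)]
    # Crude "detail" proxy: the short frame's deviation from its mean.
    mean_short = sum(short) / n
    detail = [p - mean_short for p in short]
    return [base[i] + 0.5 * detail[i] for i in range(n)]

fused = toy_fuse(short=[10, 20, 30],
                 normals=[[50, 50, 50], [52, 48, 50], [48, 52, 50]],
                 long_frame=[60, 60, 60])
print(fused)  # → [47.5, 52.5, 57.5]
```

The output keeps the brightness of the blended exposures while reintroducing the contrast gradient of the short frame – a cartoon version of “pulling sharpness” from the darker shot.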

Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”

There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.

I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.


Pixelmator Photo Adds Direct iCloud Photo Library Access, Batch Editing, and New Export Features

Pixelmator Photo for the iPad has been released with a trio of new features that greatly increase the power of the app. With the update, you can now edit images in your iCloud Photo Library non-destructively without creating duplicates. There are also new batch-processing workflows and better options for exporting images. It’s an interesting mix of updates that I expect will appeal to a wide audience, even though there are still iPadOS features I’d like to see adopted in the future.



Halide 1.14 Adds New Lens Switching Interface and Guides

Halide 1.14 is out with a new lens switching UI to accommodate the three-camera system of the iPhone 11 Pro and Pro Max. As soon as the update was out, I went for a walk to give it a try.

Halide has introduced a new lens switching button featuring haptic feedback and a dial-like system for moving among the iPhone’s lenses. When you press down on the lens button, you get a tap of haptic feedback to let you know without looking that the lens picker has been engaged.

From there, you can slide your finger among the ultra wide, wide, and telephoto options that radiate out from the button. As you swipe your finger across each option, it enlarges, and you’re met with another little bit of haptic feedback as you swipe over the lenses other than the one already selected. Once you have the lens you want, you simply let go and your iPhone switches to it.

You can also cycle through the lenses in order by tapping the button repeatedly, or swipe left for the ultra wide lens and up for the telephoto lens. In my brief tests, swiping left or up is the best option if you already know the lens you want, but the dial-like lens switcher is perfect for considering your options first, because Halide has also added lens preview guides.

With the lens button engaged, Halide shows guides for each of your zoom options. That means if you’re using the ultra wide lens, you’ll see light gray guidelines for the wide and telephoto lenses. As you swipe over those lenses, the guides change to yellow to highlight the composition you’ll get if you switch to that lens.

If you’re already using the telephoto lens though, Halide will highlight the outer frame of the image to suggest you’ll get a wider shot, though it does not zoom the viewfinder out to show that composition until you lift your finger. You can see how the lens guides work from the screenshots I took at a local high school football field above and in this video:

Switching lenses in Halide.

When you switch to the ultra wide lens, you’ll notice that not all the usual Halide features are available. Manual focus is missing and so is shooting in RAW. That’s because the new iPhone hardware and iOS and iPadOS 13 don’t support those features. Although the ultra wide shots don’t support RAW, Halide has included a ‘MAX’ option in place of the ‘RAW’ option, so you can get the most image data possible from your wide shots, which you can see in the screenshots below.

Ultra wide images are limited to MAX quality (left) instead of RAW, which is supported by the wide and telephoto lenses (right).

The Halide team says that the latest update also includes noise-reduction adjustments to the RAW images produced by the iPhone 11, but that they are continuing to fine-tune how the app handles RAW photos from the new phones as part of a more significant update that is coming next.

The latest update is relatively small, but I especially like the use of haptic feedback and lens guides, which make it easy to switch lenses when you’re focused on the viewfinder of the camera instead of Halide’s buttons.

Halide is available on the App Store for $5.99.


Pixelmator Pro for Mac Adds an Apple Photos Extension, New Zoom Tools, and Other Features

Pixelmator Pro released a significant update today that integrates the Mac app with Apple’s Photos app using a full-featured extension, adds enhanced zoom functionality, new cropping options, and more.

Pixelmator Pro is one of my favorite image editors on the Mac, but until today, I typically edited images in it that I pulled from an SD card or exported from Photos. That’s no longer necessary because Pixelmator Pro has been updated with a Photos extension that lets you use all of the app’s tools directly within Photos.

Photos extensions allow third-party developers to integrate their apps with Photos. Few of the photo editing apps I’ve used work with Photos though, which is a shame because Pixelmator Pro shows how much can be done with a Photos extension.



Snapthread 2.0 Adds a Refined UI, an Improved iPad Experience, and New Tools

When I first covered Snapthread early last year, you could tell where it was headed. The app was initially conceived by developer Becky Hansmeyer as a way to combine Snapchat videos. By last January, though, the app had evolved into a general-purpose solution for quickly and easily stitching together Live Photos, still photos, and short videos that could be shared on any social network or directly with friends and family. With version 2.0, which is out today, Hansmeyer has refined the existing user experience, added useful new functionality without complicating the app, and leveraged the iPad to create a more versatile video creation tool that works equally well for sharing your creations on social networks and with small groups of friends and family.



Spectre: A Computational Approach to Long-Exposure iPhone Photography

Spectre is a new specialized camera app from the team that created Halide, one of our favorite camera apps on iOS. The Halide team describes Spectre as a computational shutter for the iPhone, which allows the app to do things like remove people from a crowded scene, create artistic images of rushing water, and produce light trails at night. The same sorts of images can be created using traditional cameras, but getting the exposure right, holding the camera absolutely still, and accounting for other factors make them difficult to pull off. With Spectre, artificial intelligence simplifies the process, making long-exposure photography accessible to anyone with an iPhone.
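As a sketch of the general stacking idea behind computational long exposure (an illustration of a well-known technique, not Spectre’s actual implementation, which isn’t public): averaging a burst of frames mimics a long shutter, while a per-pixel median suppresses transient subjects.

```python
# Toy frame-stacking sketch. Spectre's real pipeline (stabilization,
# machine learning) is not public; this shows only the classic stacking
# idea: a per-pixel mean blurs motion like a long shutter, and a
# per-pixel median drops transient objects such as a passer-by.
from statistics import median

def average_stack(frames):
    """Simulated long exposure: mean of each pixel across frames."""
    return [sum(px) / len(px) for px in zip(*frames)]

def median_stack(frames):
    """Crowd removal: median of each pixel across frames."""
    return [median(px) for px in zip(*frames)]

# A static background (luminance 100) with a bright object passing through:
frames = [
    [100, 100, 100, 100],
    [100, 255, 100, 100],   # object at pixel 1
    [100, 100, 255, 100],   # object moved to pixel 2
    [100, 100, 100, 100],
]
print(median_stack(frames))   # every pixel is back to 100: object gone
print(average_stack(frames))  # the mean keeps a faint motion trail instead
```

Median stacking is the standard trick behind “remove people from a scene” effects; real apps also align the frames first, since handheld shots shift slightly between exposures.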



Obscura Updated with Histogram Support and a New Image Viewer

Obscura 2.0 was one of our favorite iOS app updates of 2018, which garnered it a MacStories Selects Best App Update Runner-Up award. Since it was released eight months ago, developer Ben McCarthy has continued to refine the app and add new features such as iPad support, new editing controls, localizations, and a Photos extension. With the latest update out today, Obscura has added a histogram visualizer, a redesigned image viewer, iPad keyboard shortcuts, and more.



Apple Announces ‘Shot on iPhone’ Photography Challenge

For longer than I can remember now, Apple has plucked photos taken by iPhone users from the vast sea of images posted online and featured them on billboards, in print advertising, and online. Today, the company announced that through February 7th, it’s running a Shot on iPhone Challenge. Apple says that:

A panel of judges will review worldwide submissions and select 10 winning photos, to be announced in February. The winning photos will be featured on billboards in select cities, Apple retail stores and online.

Apple’s announcement also introduces the contest’s 11 judges, 6 of whom are from Apple and 5 of whom are outsiders with backgrounds in photography. The Apple team includes Phil Schiller, members of his marketing team, and others who work on Apple’s photo software. The rest of the judges include former White House photographer Pete Souza; travel photographer Austin Mann, who we recently interviewed for Club MacStories; Annet de Graaf, a travel photographer and author of iPhone photography books; Luisa Dörr, who shot TIME magazine’s special Firsts issue entirely on an iPhone; and Chen Man, a visual artist and creator of photography-based social apps.

If you’d like to submit your iPhone photos, here’s what to do:

Post your best photo taken on iPhone to Instagram or Twitter with the #ShotOniPhone hashtag to participate in the Shot on iPhone Challenge. Weibo users can participate as well using #ShotOniPhone#. In the image caption, note which model was used. Alternatively, you can also submit the photo in its highest resolution to shotoniphone@apple.com with the file format ‘firstname_lastname_iphonemodel.’ Photos can be straight from the camera, edited through Apple’s editing tools in the Photos app, or with third-party software. Submissions begin at 12:01 a.m. PST on January 22 and end at 11:59 p.m. PST on February 7. You must be 18 years of age or older to participate, and this challenge is not open to Apple employees or their immediate families.

I’ve always enjoyed Apple’s Shot on iPhone ad campaigns. It will be fun to see which shots its judges pick from what I can only imagine will be a huge number of submissions.