As reported today by Juli Clover at MacRumors, Apple is now allowing iCloud Photos users to transfer a copy of their data to Google Photos. This joins the growing suite of tools provided on Apple’s Data and Privacy webpage, which also includes downloading copies of your data, correcting your data, and deactivating or deleting your account.
As described in a new Apple Support document on this topic, the iCloud Photos transfer process does not delete your photos and videos from iCloud; it simply sends a copy of the data to Google Photos. Clover notes:
The transfer process takes between three and seven days, with Apple verifying that the request was made by you. To do the transfer, you must have two-factor authentication turned on for your Apple ID account and you must have a Google Photos account with enough storage to complete the transfer.
Further details over at MacRumors.
BestPhotos is a streamlined photo management app for iOS and iPadOS, which we last covered a couple of years ago. The app seeks to help users clean up their photo libraries using smart searches, photo comparisons, and metadata edits. Today marks the release of BestPhotos 3.0, a major update in which developer Eric Welander has added a variety of useful new features and improvements.
I’ve used Pixelmator longer than almost any other app. Over time I regularly change up task managers, email clients, note-taking apps, and more, but nothing has ever come close to replacing Pixelmator for me. In fact, last summer I had to try living without it for a time while the app was briefly incompatible with the iOS 13 beta. Try as I might, I could find no replacement for the excellent layer-based image editor.
Not even Pixelmator Photo, the more modern photo editing tool, could replace the original Pixelmator. It wasn’t exactly intended to, since the apps specialize in different areas, but the standard Pixelmator nonetheless felt light on meaningful updates even before the release of Pixelmator Photo. My fear was that eventually the app would be discontinued.
Pixelmator 2.5, launching today, is strong evidence that that’s not going to happen. By transitioning the app to the Files document browser, designing an all-new photo browser, and adding a rich collection of new image size presets, Pixelmator’s team has crafted the app’s biggest leap forward in years and set it up for a strong future.
The Gnarbox 2.0 is ruggedized, portable SSD storage designed with photographers and videographers in mind. I’ve tried lots of different portable storage solutions in the past, and what distinguishes the Gnarbox is its ability to operate both as a standalone device and as an accessory to a computer. The mix of fast, rugged storage, an onboard operating system, wired and wireless connectivity, and complementary software isn’t cheap: the entry-level Gnarbox is $499. However, the Gnarbox offers both the peace of mind of in-the-field backups and image and video pre-processing, making it a compelling choice for anyone who captures lots of photos and video while away from their main computer.
Versatility and reliability are what you’re buying when you get a Gnarbox. I’ve tried other WiFi-enabled backup solutions, including Western Digital’s My Passport Wireless SSD, but in the weeks that I’ve been using a 256GB Gnarbox 2.0 that the company sent for testing, I’ve found that it’s built better, is more capable, and is easier to use than any other portable storage I’ve tried.
About this time last year, Apple announced its first-ever ‘Shot on iPhone’ photography challenge, judged by a panel of professional photographers and Apple employees. Apple is back this year with a new contest asking users to submit their Night mode photos.
Through January 29th, Apple is taking submissions on Instagram, Twitter, and Weibo. To qualify, post your photos on Instagram or Twitter with the hashtags #ShotoniPhone and #NightmodeChallenge, or on Weibo using #ShotoniPhone# and #NightmodeChallenge#.
Five winners will be picked by a panel of judges that includes:
plus the following Apple executives and employees:
- Phil Schiller
- Kaiann Drance
- Brooks Kraft
- Jon McCormack
- Arem Duplessis
The five winning photos will be announced on March 4th on the Apple Newsroom. Apple says the images may also be used in digital campaigns, at stores, on billboards, and in photo exhibitions.
Night mode photography was a big part of Federico’s story on iPhone 11 Pro photography called Eternal City, Modern Photography: The iPhone 11 Pro in Rome. Here’s an outtake from that story that Federico submitted for the challenge:
For more on the contest and tips on shooting Night mode photos, check out Apple’s press release.
The Loupedeck+ is a hardware control panel for editing photos and video that transforms the software tools you’re accustomed to using with a mouse or trackpad into physical buttons, knobs, and dials. By eliminating the need to dive into menus and hunt for onscreen tools, the Loupedeck+ changes the image editing process into something much closer to the feeling of editing on an iPad with the Apple Pencil. The seemingly endless series of swipes, drags, and clicks is replaced by something far more tactile and natural.
The result is a clear example of the benefit of using a dedicated tool for a particular task. Photo and video editing is often a high-volume, high-precision activity with lots of repetition and, depending on your job, tight deadlines. That makes any tool that can shave a little time off of editing each photo a win for professionals who often edit thousands of images in a week.
What I didn’t expect, though, is that the Loupedeck+ also makes editing more accessible for beginners like myself. As I’ll explain in more detail below, when Loupedeck sent me their device to test, I spent most of my time using it in Adobe Lightroom Classic, which I hadn’t used before. However, after a short time familiarizing myself with the Loupedeck+ layout, I found myself deep in the editing process with my eyes fixed on the images I was working on instead of darting back and forth hunting for the tools I wanted to use.
I may never enjoy the sort of time savings that a professional photographer could squeeze out of the Loupedeck+. However, simply knowing that I can dip in and out of Lightroom Classic for my editing needs with virtually no learning curve eliminates a significant hurdle that has slowed me down in the past. Although there are aspects of the Loupedeck+ that could be improved, it’s an incredibly powerful tool that fits into more workflows than I anticipated, which makes it an accessory worth considering for a wide range of users.
Metapho has been one of my favorite photo utilities on iOS for years. The marquee feature has always been its ability to strip metadata from images, which is handy when sharing photos online, for instance. Over time though, Metapho has grown to incorporate other functionality for inspecting and editing photo metadata that has made the app a must-have iOS utility. With its latest update, Metapho has added Deep Fusion and Night Mode photo detection, an intriguing addition that I haven’t seen any other app offer.
In a video shared earlier today, Tom Hogarty, a Lightroom product manager at Adobe, demonstrated an upcoming feature of Lightroom for iPad – the ability to import photos from external devices (such as cameras, drives, or SD cards connected over USB-C) into Lightroom’s library without copying them to the Photos app first.
Here’s how it’s going to work:
The workflow looks very nice: an alert comes up as soon as an external device is detected, photos are previewed in a custom UI within Lightroom (no more Photos overlay), and they’re copied directly into the app. I think anyone who uses Lightroom for iPad to edit photos taken with a DSLR is going to appreciate this addition. Keep in mind that the 2018 iPad Pros support up to 10 Gbps transfers over USB-C, which should help when importing hundreds of RAW files into Lightroom.
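As a back-of-envelope illustration of what that 10 Gbps ceiling means in practice, here’s a quick calculation. The file count and file size are hypothetical, and real-world throughput will always fall short of the link’s theoretical maximum:

```python
# Back-of-envelope: theoretical minimum time to import RAW files over USB-C.
# 10 Gbps is the 2018 iPad Pro's USB-C signaling rate; actual transfers
# will be slower due to protocol and storage overhead.

def transfer_seconds(file_count, file_size_mb, link_gbps):
    """Theoretical transfer time, treating MB and Gb as decimal units."""
    total_bits = file_count * file_size_mb * 8 * 1_000_000  # MB -> bits
    return total_bits / (link_gbps * 1_000_000_000)         # bits / (bits/s)

# e.g. a hypothetical shoot: 300 RAW files at 40 MB each over a 10 Gbps link
seconds = transfer_seconds(300, 40, 10)  # 9.6 seconds, in theory
```

Even allowing for substantial overhead, a card’s worth of RAW files should import in well under a minute.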
Direct photo import from external USB storage devices was originally announced by Apple at WWDC 2019 as part of the “Image Capture API” for iPadOS. When I was working on my iOS and iPadOS 13 review, I searched for documentation to cover the feature, but I couldn’t find anything on Apple’s website (I wasn’t the only one). Eventually, I just assumed it was one of the features Apple delayed until later in the iOS 13 cycle. It turns out that this feature was quietly introduced by Apple with iOS and iPadOS 13.2, as also suggested by Hogarty in the Lightroom video.
According to this thread on Stack Overflow, direct photo import is part of the ImageCaptureCore framework, which is now also available for iOS and iPadOS. I still can’t find any documentation for it on Apple’s developer website.
According to TechCrunch’s Matthew Panzarino, Apple will today roll out the Deep Fusion camera feature, announced at the company’s fall iPhone event, as part of the iOS developer beta program.
Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:
The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.
This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
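To make the idea concrete, here is a deliberately simplified toy sketch of multi-frame fusion in that spirit: brightness comes from the regular and long exposures, while extra detail is pulled from the darker, sharper short frame. The frames, weights, and “detail” heuristic are all invented for illustration; Apple’s actual pipeline uses neural networks and per-sensor noise models:

```python
# Toy sketch of multi-frame exposure fusion, loosely following Panzarino's
# description. All values and weights are illustrative, not Apple's algorithm.

def mean_frame(frames):
    """Pixel-wise mean of equally sized frames (flat lists of floats)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def fuse(short_frame, ev0_frames, long_frame, detail_weight=0.5):
    """Blend brightness from the EV0/EV+ frames with detail from the short frame.

    'Detail' is crudely approximated as each pixel's deviation from the short
    frame's mean, standing in for the sharpness pulled from the underexposed shot.
    """
    base = mean_frame(ev0_frames + [long_frame])
    short_mean = sum(short_frame) / len(short_frame)
    detail = [px - short_mean for px in short_frame]
    return [b + detail_weight * d for b, d in zip(base, detail)]

# Tiny 4-pixel "frames": the short frame is darker but has more contrast.
short = [10.0, 40.0, 12.0, 38.0]
ev0_a = [50.0, 60.0, 52.0, 58.0]
ev0_b = [48.0, 62.0, 50.0, 60.0]
long_ = [70.0, 80.0, 72.0, 78.0]

fused = fuse(short, [ev0_a, ev0_b], long_)
```

The fused result keeps the overall brightness of the well-exposed frames while amplifying the contrast carried by the short frame, which is the basic trade the real system is making at vastly greater sophistication.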
Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”
There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.
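The automatic selection logic might be caricatured as a simple decision function. The lens and over-crop rules below come from Panzarino’s report, but the lux thresholds are entirely hypothetical, since Apple hasn’t published the actual cutoffs:

```python
# Toy illustration of automatic pipeline selection as Panzarino describes it.
# The numeric thresholds are invented; Apple has not disclosed real values.

def pick_pipeline(lens, lux, over_crop_enabled=False):
    """Return which processing pipeline a shot would hypothetically use."""
    if lens == "ultra_wide" or over_crop_enabled:
        # Deep Fusion doesn't run on the ultra wide lens, and is disabled
        # when the over-crop reframing feature is turned on.
        return "smart_hdr"
    if lux < 10:        # hypothetical low-light cutoff
        return "night_mode"
    if lux < 600:       # hypothetical mid-light band where Deep Fusion engages
        return "deep_fusion"
    return "smart_hdr"  # bright scenes
```

The point is simply that the user never chooses: the camera app routes each shot based on lens and measured light, which is why there is no Deep Fusion toggle in Settings.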
I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.