This Week's Sponsor:

Gamery

A sleek and intuitive game library app for casuals and pros


GoodNotes Adds Superior Multiwindowing, Dark Mode, and OCR Scanning

When Apple introduces a fundamental change to how apps interact with iOS (and now iPadOS), it’s always fascinating to see the different ways those changes are adopted by third-party developers. Although Apple sets a baseline for how new frameworks work, it has less control over how they are used, and it’s those implementation details that often have the greatest impact on users.

In the inaugural version of iPadOS, that’s been the case for multiwindowing. Many apps stick to the basic built-in ways to open multiple instances of the same app. However, the real power of multiwindowing is revealed by apps like GoodNotes, which goes the extra mile and provides an experience better than any other multiwindow app I’ve tried.

Dragging a PDF into a new window.

What makes GoodNotes special is not just the sheer number of unique ways to open a second window for the app, though that’s impressive in itself, but the thoughtfulness of its execution. In every corner of the app where you might want to move a folder, document, page, or bit of data into a new window or document, you can. The result is a level of flexibility that even Apple’s own iWork suite doesn’t match, further cementing GoodNotes as my favorite app for taking handwritten notes.
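GoodNotes’ implementation details aren’t public, but the mechanism underneath all of this is UIKit’s scene API introduced in iOS and iPadOS 13. As a rough sketch of the kind of call a third-party app makes to open a document in a second window (the activity type and userInfo key below are hypothetical placeholders, not GoodNotes’ own):

```swift
import UIKit

// Minimal sketch of requesting a second window on iPadOS 13.
// The activity type and userInfo key are hypothetical, not GoodNotes' own.
func openDocumentInNewWindow(documentID: String) {
    let activity = NSUserActivity(activityType: "com.example.notes.openDocument")
    activity.userInfo = ["documentID": documentID]

    // Ask UIKit to create a new scene session for this activity; the app's
    // scene delegate then decides what to display in the new window.
    UIApplication.shared.requestSceneSessionActivation(
        nil,                    // nil = create a new scene session
        userActivity: activity,
        options: nil,
        errorHandler: { error in
            print("Failed to open a new window: \(error)")
        }
    )
}
```

Dragging content to the edge of the screen goes through a related path (an NSUserActivity attached to the drag item), but the idea is the same: the app describes what the new window should show, and the system creates the scene. What sets GoodNotes apart is how many entry points it wires up to that machinery.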

Read more


Apple to Open Siri Up to Third-Party Messaging and Phone Apps

Bloomberg reports that Apple will open up Siri to third-party messaging apps with a software update later this year. Third-party phone apps will be added later. According to Bloomberg’s Mark Gurman:

When the software refresh kicks in, Siri will default to the apps that people use frequently to communicate with their contacts. For example, if an iPhone user always messages another person via WhatsApp, Siri will automatically launch WhatsApp, rather than iMessage. It will decide which service to use based on interactions with specific contacts. Developers will need to enable the new Siri functionality in their apps. This will be expanded later to phone apps for calls as well.

As Gurman notes, the change in approach comes as Apple faces scrutiny over the competitive implications of its dual role as app maker and App Store gatekeeper in the US and elsewhere.

It’s interesting that the update is a Siri-only change. Users still won’t be able to replace Messages with WhatsApp or Phone with Skype as their default messaging and phone apps, for instance, but it strikes me as a step in the right direction and a change that I hope leads to broader customization options on iOS and iPadOS.

Permalink

Apple’s Deep Fusion Camera Feature Launching as Part of the iOS Developer Beta Program

According to TechCrunch’s Matthew Panzarino, Apple will roll out Deep Fusion, the camera feature announced at the company’s fall iPhone event, today as part of the iOS developer beta program.

Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:

The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
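Apple doesn’t expose Deep Fusion itself to developers, but the exposure bracketing Panzarino describes is conceptually similar to what third-party apps can already request through AVFoundation’s bracketed capture API. The sketch below only grabs a short/regular/long bracket and leaves the pixel-level fusion out entirely; it’s an illustration of the capture side under those assumptions, not Apple’s pipeline, and it assumes a configured capture session and delegate:

```swift
import AVFoundation

// Rough sketch: capture an exposure bracket (EV-, EV0, EV+) with AVFoundation.
// This is NOT Deep Fusion -- Apple's fusion step runs inside the camera pipeline
// and isn't available as an API. Session setup and the delegate are assumed.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-1.0, 0.0, 1.0]   // a "short" frame, a regular frame, a "long" frame
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }

    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,                                    // 0 = no RAW, processed frames only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings
    )

    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```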

Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”

There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.

I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.

Permalink

Apple Maps’ Northeast Expansion Visualized

Yesterday Apple began the full rollout of its new maps for the Northeastern United States. And right on the heels of that release, Justin O’Beirne has meticulously documented and compared Apple’s new work with the mapping data that it replaces. He begins by highlighting Apple’s progress toward launching its new maps throughout the U.S.:

With this latest expansion, Apple’s new map now covers 27.5% of the U.S.’s land area and almost half of its population (47.2%).

Apple has promised that its new maps would be available to all U.S. users by the end of 2019, so the rollout – which began last September with iOS 12 – will need to significantly accelerate from now through December.

The bulk of O’Beirne’s post lets the new maps speak for themselves. Drawing on notable locations throughout the Northeast, O’Beirne showcases direct comparisons between the old and new maps. As has been seen in other parts of the U.S., Apple’s new maps feature significantly expanded vegetation, new detail for features like ball fields, small parks, and rivers, and even road enhancements. Apple’s Look Around feature, which competes with Google Street View, is also now available in some of these Northeastern locations, such as New York City.

Maps in iOS 13 is a big update, particularly if you’re in an area that has the new Apple-designed maps. I’m excited to have the new maps arrive in New York City, and look forward to getting some hands-on time with them over the coming weeks.

Permalink

A Comprehensive Guide to All 120+ Settings URLs Supported by iOS and iPadOS 13.1

A few weeks ago, I came across a post on Reddit claiming that Apple had restored the ability to launch specific sections of the Settings app via Shortcuts in iOS and iPadOS 13.1. I was inspired by that discovery to finish working on a project I had long been putting off: documenting all the URLs supported by the Settings app in iOS and iPadOS.

After a lot of trial and error, I’ve collected 120+ URLs that can open individual pages and sub-sections of the Settings app. In this post, I’m going to share the complete list of URLs that are supported as of iOS and iPadOS 13.1 (specifically, iOS 13.1.2), as well as a custom shortcut to launch them.
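A quick note for developers: these URLs are meant for Shortcuts and similar automation. The only Settings URL Apple officially documents for apps is the one that opens your own app’s page in Settings; as a point of contrast, here’s a minimal sketch of that documented case:

```swift
import UIKit

// Minimal sketch: the one Settings URL Apple documents for third-party apps.
// It opens the current app's own page in the Settings app; the 120+ URLs in
// this post use undocumented paths and are intended for Shortcuts instead.
func openThisAppsSettingsPage() {
    guard let url = URL(string: UIApplication.openSettingsURLString),
          UIApplication.shared.canOpenURL(url) else { return }
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}
```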

Read more


Pixelmator Photo Adds Direct iCloud Photo Library Access, Batch Editing, and New Export Features

Pixelmator Photo for the iPad has been released with a trio of new features that greatly increase the power of the app. With the update, you can now edit images in your iCloud Photo Library non-destructively without creating duplicates. There are also new batch-processing workflows and better options for exporting images. It’s an interesting mix of updates that I expect will appeal to a wide audience, even though there remain iPadOS features I’d like to see adopted in the future.

Read more



AppStories, Episode 132 – Apple Arcade: The Launch and Favorite Games

On this week’s episode of AppStories, we cover the all-new MacStories Shortcuts Icons, a set of 300 custom icons for shortcuts that you can add to your Home screen, and then revisit Apple Arcade in the wake of its public launch, discussing some of our favorite games so far.

Sponsored by:

  • Pingdom – Start monitoring your website performance and availability today, and get instant alerts when an outage occurs or a site transaction fails. Use offer code APPSTORIES to get 30% off.
  • Zapier – The easiest way to automate your work. Start your 14-day free trial today.
  • Direct Mail – Create and send great looking email newsletters with Direct Mail, an easy-to-use email marketing app designed exclusively for the Mac.

Permalink

Halide 1.14 Adds New Lens Switching Interface and Guides

Halide 1.14 is out with a new lens switching UI to accommodate the three-camera system of the iPhone 11 Pro and Pro Max. As soon as the update was out, I went for a walk to give it a try.

Halide has introduced a new lens switching button featuring haptic feedback and a dial-like system for moving among the iPhone’s lenses. When you press down on the lens button, you get a tap of haptic feedback to let you know without looking that the lens picker has been engaged.

From there, you can slide your finger among the ultra wide, wide, and telephoto options that radiate out from the button. As you swipe your finger across each option, it enlarges, and you’re met with another little bit of haptic feedback as you swipe over the lenses other than the one already selected. Once you have the lens you want, you simply let go and your iPhone switches to it.
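Halide’s code isn’t public, but the feedback described here maps neatly onto UIKit’s feedback generator classes. A hedged sketch of how an app might produce that kind of haptics, purely as an illustration and not Halide’s actual implementation:

```swift
import UIKit

// Illustrative only, not Halide's implementation: one generator for the firm
// "picker engaged" tap, another for the lighter ticks while sliding between lenses.
final class LensPickerHaptics {
    private let engage = UIImpactFeedbackGenerator(style: .medium)
    private let tick = UISelectionFeedbackGenerator()

    func pickerEngaged() {
        engage.prepare()
        engage.impactOccurred()     // tap when the lens button is pressed
    }

    func slidOverLens(isCurrentlySelected: Bool) {
        guard !isCurrentlySelected else { return }
        tick.selectionChanged()     // light tick when crossing a lens other than the current one
        tick.prepare()              // keep the Taptic Engine ready for the next tick
    }
}
```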

You can also cycle through the lenses in order by tapping the button repeatedly, or you can swipe left for the ultra wide lens or up for the telephoto one. In my brief tests, swiping left or up is the best option if you already know the lens you want, but using the dial-like lens switcher is perfect for considering your options first because Halide has also added lens preview guides.

With the lens button engaged, Halide shows guides for each of your zoom options. That means if you’re using the ultra wide lens, you’ll see the light gray guidelines for the wide and telephoto lenses. As you swipe over those lenses, the guides change to yellow to highlight the composition you’ll get if you switch to that lens.

If you’re already using the telephoto lens, however, Halide highlights the outer frame of the image to suggest you’ll get a wider shot, though it does not zoom the viewfinder out to show that composition until you lift your finger. You can see how the lens guides work in the screenshots I took at a local high school football field above and in this video:

Switching lenses in Halide.

When you switch to the ultra wide lens, you’ll notice that not all the usual Halide features are available. Manual focus is missing, and so is shooting in RAW. That’s because the new iPhone hardware and iOS and iPadOS 13 don’t support those features on the ultra wide camera. Although ultra wide shots don’t support RAW, Halide has included a ‘MAX’ option in place of the ‘RAW’ option, so you can get the most image data possible from your ultra wide shots, which you can see in the screenshots below.

Ultra wide images are limited to MAX quality (left) instead of RAW, which is supported by the wide and telephoto lenses (right).
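As an aside for developers, those per-lens limitations surface through AVFoundation’s capability properties, so a camera app can check them before offering RAW or manual focus controls. A hedged sketch of that kind of check, not Halide’s actual code:

```swift
import AVFoundation

// Illustrative sketch: query whether the active camera supports RAW capture and
// manual (custom lens position) focus before enabling the corresponding controls.
func supportsRawAndManualFocus(device: AVCaptureDevice,
                               photoOutput: AVCapturePhotoOutput) -> (raw: Bool, manualFocus: Bool) {
    // RAW is only an option when the current camera/output exposes a RAW pixel format.
    let rawAvailable = !photoOutput.availableRawPhotoPixelFormatTypes.isEmpty

    // Custom lens position focus is a per-device capability; a fixed-focus camera
    // like the ultra wide reports false here.
    let manualFocusAvailable = device.isLockingFocusWithCustomLensPositionSupported

    return (rawAvailable, manualFocusAvailable)
}
```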

The Halide team says the latest update also includes noise-reduction adjustments to the RAW images produced by the iPhone 11, but that they are continuing to fine-tune how the app handles RAW photos from the new phones as part of a more significant update coming next.

The latest update is relatively small, but I especially like the use of haptic feedback and lens guides, which make it easy to switch lenses when you’re focused on the viewfinder of the camera instead of Halide’s buttons.

Halide is available on the App Store for $5.99.