Posts in Linked

AppStories, Episode 85 – Personalizing Our iOS Setups

On this week’s episode of AppStories, we talk about some of the ways we personalize and tweak our iOS setups, including wallpapers, widgets, Control Center, share sheets, app icons, and more.

Sponsored by:

  • Luna Display - The only hardware solution that turns your iPad into a wireless display for your Mac. Use promo code APPSTORIES at checkout for 10% off.
  • Mack Weldon - Smart underwear for smart guys. Get 20% off your first order with the code APPS.

Permalink

Apple Adds Videos to Developer Portal Featuring the New iPad Pro and Pencil

To help developers take advantage of the latest features of the iPad Pro and Apple Pencil, Apple has posted two new videos to its developer portal. Even if you aren’t a developer, though, the videos include interesting insights into some of the unique features of the new iPad Pros.

The videos cover development and design issues to consider when adapting apps to the new iPad Pros, such as using safe area insets to keep content from crowding into the rounded corners or under the home indicator. Another consideration is that, unlike previous iPads, the 11” iPad Pro doesn’t have a 4:3 aspect ratio, which means apps hard-coded to those dimensions will have areas cut off at the top and bottom.
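For developers, the safe area guidance boils down to constraining content to the safe area layout guide rather than the screen edges. Here’s a minimal UIKit sketch of that idea (not taken from Apple’s videos); the contentView is a hypothetical placeholder:

```swift
import UIKit

final class ContentViewController: UIViewController {
    // Hypothetical content view used purely for illustration.
    private let contentView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        contentView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(contentView)

        // Pin to the safe area layout guide instead of the view's edges so
        // content stays clear of the rounded corners and the home indicator.
        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            contentView.topAnchor.constraint(equalTo: guide.topAnchor),
            contentView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
            contentView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
            contentView.bottomAnchor.constraint(equalTo: guide.bottomAnchor)
        ])
    }
}
```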

Also, apps that don’t link against the iOS 12.1 SDK will run in a compatibility mode when multitasking, which adds an inset at the top and bottom of the screen for both apps instead of running them fullscreen. Apple says that making sure iPad apps can handle this inset compatibility mode will also help with bringing iOS apps to the Mac in 2019.

The Apple Pencil has a set of default double-tap settings that we covered in our iPad overview, but developers have the option to customize the double-tap action in their apps. Apple also encourages developers not to hide functionality behind the gesture or to turn custom actions on by default.
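For those curious what that customization looks like in code, here’s a minimal sketch using the UIPencilInteraction API that ships with the iOS 12.1 SDK (the tool-switching helpers are hypothetical, and this isn’t code from Apple’s videos):

```swift
import UIKit

final class CanvasViewController: UIViewController, UIPencilInteractionDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Register for Apple Pencil double-tap events (iOS 12.1 and later).
        let pencilInteraction = UIPencilInteraction()
        pencilInteraction.delegate = self
        view.addInteraction(pencilInteraction)
    }

    func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
        // Respect the user's system-wide double-tap preference instead of
        // forcing a custom action on by default.
        switch UIPencilInteraction.preferredTapAction {
        case .switchEraser:
            toggleEraser()
        case .switchPrevious:
            switchToPreviousTool()
        case .showColorPalette:
            showColorPalette()
        case .ignore:
            break // The user has turned the gesture off.
        @unknown default:
            break
        }
    }

    // Hypothetical tool-switching helpers for illustration only.
    private func toggleEraser() {}
    private func switchToPreviousTool() {}
    private func showColorPalette() {}
}
```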

The videos cover the iPad Pro’s new USB-C connector too. The iPad Pro supports HDR 4K output at up to 60Hz and external displays up to 5K, as well as USB audio devices, Ethernet, and MIDI. The iPad Pro can also handle simultaneous USB-C connections, which permits uses like connecting a DSLR camera and a 5K display to the iPad Pro at the same time.
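Apps that want to do more than mirror the iPad’s screen can observe UIKit’s screen-connection notifications and drive an attached display with their own window. The sketch below is a general illustration of that pattern, not Apple’s sample code:

```swift
import UIKit

final class ExternalDisplayController {
    // Window used to present dedicated content on the external display.
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    func startObserving() {
        let center = NotificationCenter.default
        // Posted when a display is attached over USB-C (or AirPlay).
        observers.append(center.addObserver(forName: UIScreen.didConnectNotification,
                                            object: nil, queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            // Give the external screen its own window instead of mirroring.
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen
            window.rootViewController = UIViewController() // hypothetical presentation controller
            window.isHidden = false
            self?.externalWindow = window
        })
        observers.append(center.addObserver(forName: UIScreen.didDisconnectNotification,
                                            object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow = nil
        })
    }
}
```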

The new videos are available as part of Apple’s Tech Talk series.


You can also follow all of our Apple event coverage through our October 30, 2018 hub, or subscribe to the dedicated October 30, 2018 RSS feed.

Permalink

Halide Developer Ben Sandofsky Breaks Down How the iPhone XR Captures Depth Data

Ben Sandofsky from the team that makes the Halide iOS camera app has a detailed post on the iPhone XR’s camera and how Apple creates Portrait Mode photos with a single lens. Sandofsky walks through how Apple uses Focus Pixels to develop a rough Disparity Map and combines that with a Portrait Effects Matte to create Portrait Mode images.

The results have some advantages, but also distinct disadvantages compared to the iPhone XS’s camera. As Sandofsky explains:

It seems the iPhone XR has two advantages over the iPhone XS: it can capture wider angle depth photos, and because the wide-angle lens collects more light, the photos will come out better in low light and have less noise.

However:

…most of the time, the XS will probably produce a better result. The higher fidelity depth map, combined with a focal length that’s better suited for portraiture means people will just look better, even if the image is sometimes a bit darker. And it can apply Portrait effects on just about anything, not just people.

Although Apple’s Camera app can only take Portrait Mode photos of people on the iPhone XR, the upcoming Halide 1.11 update will combine the XR’s Disparity Map with Halide’s own blur effect to apply a similar effect to non-human subjects. Sandofsky admits the feature isn’t perfect because of the low quality of the Disparity Map the XR produces, but the photos included in his post show that it can deliver excellent results under the right conditions.
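As a developer-facing aside: the depth information and Portrait Effects Matte Sandofsky describes are saved with the photo as auxiliary image data, which third-party apps can read back with ImageIO and AVFoundation. The sketch below shows one way to do that; it’s a general illustration under those assumptions, not Halide’s actual implementation:

```swift
import AVFoundation
import ImageIO

// Reads the disparity map and Portrait Effects Matte embedded in a photo file, if present.
func readDepthAndMatte(from url: URL) -> (depth: AVDepthData?, matte: AVPortraitEffectsMatte?) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return (nil, nil) }

    var depthData: AVDepthData?
    if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any] {
        depthData = try? AVDepthData(fromDictionaryRepresentation: info)
    }

    var matte: AVPortraitEffectsMatte?
    if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any] {
        matte = try? AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
    }

    return (depthData, matte)
}
```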

It’s remarkable what is being done to squeeze depth information out of the XR’s single lens and instructive to understand how the underlying technology works. It’s also apparent that Apple has made significant advancements since the introduction of the first dual-lens cameras.

Permalink

The Journalists and Processes That Drive Apple News

Today Jack Nicas of The New York Times published a first-of-its-kind, in-depth look behind the curtain at how the Apple News editorial team operates. The piece highlights Apple’s distinct handling of the news, where human curation is a larger driving factor than at other major tech companies. Nicas writes:

Apple has waded into the messy world of news with a service that is read regularly by roughly 90 million people. But while Google, Facebook and Twitter have come under intense scrutiny for their disproportionate — and sometimes harmful — influence over the spread of information, Apple has so far avoided controversy. One big reason is that while its Silicon Valley peers rely on machines and algorithms to pick headlines, Apple uses humans like Ms. Kern.

The former journalist has quietly become one of the most powerful figures in English-language media. The stories she and her deputies select for Apple News regularly receive more than a million visits each.

Lauren Kern, the editor in chief of Apple News, heads a staff of journalists that spans the globe. One of their chief responsibilities is selecting each day’s top stories for the app.

Ms. Kern leads roughly 30 former journalists in Sydney, London, New York and Silicon Valley. They spend their days consuming news across the internet, fielding 100 to 200 pitches a day from publishers, and debating which stories get the top spots.

Ultimately, they select five stories to lead the app, with the top two also displayed in a prominent window to the left of the iPhone home screen. They also curate a magazine-style section of feature stories. The lineup typically shifts five or more times a day, depending on the news.



Twitter Testing Reply Buttons, Presence Indicators, Ice Breakers, and More

In an interesting move that highlights Twitter’s recent efforts to develop its product more openly, soliciting feedback from its user base, the company invited The Verge’s Casey Newton to a meeting where it shared details on a handful of new features it’s working on.

One project in the works is a new design that makes it clearer how to reply to tweets. Current prototypes resemble the sort of UI found on Facebook and Instagram, with a reply button indented underneath tweets. Related to that change, and again following the lead of other popular social networks, Twitter is reworking the UI for threads so they read more like a conversation than a string of individual tweets.

One of my favorite ideas from those shared with The Verge was something Twitter calls “ice breakers.” Newton writes:

Another feature Twitter is considering is a twist on the pinned tweet designed to promote conversations. The company showed me a design that would let you pin an “ice breaker” to the top of your profile to let people know you wanted to talk about something specific. The company’s design director, Mike Kruzeniski, told me it could help Twitter users channel their followers’ enthusiasm into discussions they wanted to have — whether it be about a new project, a current event, or some other item of interest.

The current implementation of pinned tweets is fine, but I love the idea of conversation starter tweets that can be changed up over time as users’ interests and desires for connection change.

Finally, presence indicators and status indicators are two similar features that would, respectively, let the world know when you’re online and ready for conversation, and let you share a short status such as “at WWDC19.” While I’m generally not a fan of presence indicators, according to Newton, Twitter’s will be entirely optional, which I appreciate. If the feature were used sparingly enough, it could replace the standard AMA (“ask me anything”) tweets that are common when a user is free and open for conversation. Most likely, though, Twitter will activate presence indicators by default for all users, making that specific use case doubtful.

None of these changes are set in stone, so be sure to make your voice heard if you feel strongly about any of them. As Twitter’s Sara Haider told Newton:

“Coming up with a product in a silo and dumping it on people is not going to work,” she said. “Some people are going to love it. Some people are going to hate it. We want to understand what people’s feedback is, and then tweak and iterate on the product.”

Permalink

Connected, Episode 214: The Californian Idea of Food

Stephen was wrong, and Myke demands an apology before explaining what makes up dust. The FileMaker world is considered, then Federico explains why he thinks one of the food emoji is wrong. Lastly, Adobe and Palm are both in the news.

Last week’s episode of Connected was a fun one – we discussed food emoji, Photoshop on iPad, and the Shortcuts 2.1 beta. You can listen here.

Sponsored by:

  • Squarespace: Make your next move. Enter offer code CONNECTED at checkout to get 10% off your first purchase.
  • Luna Display: The only hardware solution that turns your iPad into a wireless display for your Mac. Use promo code CONNECTED at checkout for 10% off.
  • PayPal: When it comes to growing your business, PayPal is your payments partner for today and tomorrow.

Permalink


Phil Schiller on the iPhone XR

Phil Schiller, Apple’s Senior Vice President of Worldwide Marketing, was interviewed by Engadget about the iPhone XR, which will begin arriving on doorsteps and in stores on Friday.

Engadget’s Chris Velazco asked Schiller about the meaning behind the new iPhone’s ‘XR’ moniker. According to Schiller, XR doesn’t stand for anything in particular, but he associates XR with cars:

“I love cars and things that go fast, and R and S are both letters used to denote sport cars that are really extra special,” he said with a smile.

Velazco, who clears up some confusion about the screen (it detects and tracks touches at 120Hz but doesn’t refresh at that rate), also asked about criticisms that the XR’s LCD panel isn’t as high-resolution as the displays in some other premium phones. Schiller responded:

“I think the only way to judge a display is to look at it,” he told me, adding that Apple calls these screens “retina displays” because your eye can’t discern individual pixels unless you press your face up right against the glass. “If you can’t see the pixels, at some point the numbers don’t mean anything. They’re fairly arbitrary.”

I’m looking forward to trying the XR, especially the camera. I ordered one for my youngest son, and it arrives Friday. One of the conditions when my kids get an Apple product that I don’t buy for myself is that I get first crack at it for MacStories, so I plan to do some side-by-side photo tests with the XR and the XS Max soon. If the early indications are correct, I expect the XR will hold up reasonably well against the dual-lens models.

For more from Schiller about the XR, be sure to check out Velazco’s full interview on Engadget.

Permalink

Why Photoshop on iOS Is A Huge Win for the iPad Pro

Jason Snell, writing for Macworld on the announcement of Photoshop coming to iPad next year:

Adobe’s move to iPad instantly makes everyone who knows, loves, or relies on Photoshop a possible candidate for an iPad Pro. And make no mistake, the iPad Pro is already plenty powerful enough to run Photoshop, and with the Apple Pencil it’s got an input method that will satisfy most graphics pros. Even better, Adobe has said that it will be building in cloud syncing for Photoshop files, so that you’ll be able to seamlessly hand off projects directly from one device to another.

A lot of the criticism of the iPad Pro as a flawed tool for doing real work comes down to software. The hardware is capable—but where’s the software? These arguments have been weak for a while now—I think Microsoft Office for iPad is aces, and Apple’s iWork apps are remarkably capable, too—but with every major app that arrives on the platform, the quieter that criticism has to get. Adobe’s also bringing a simplified version of Premiere, called Premiere Rush, to the iPad. I wonder if Apple’s considering just how Final Cut and Logic might work on the iPad?

As I’ve been arguing for a while now, I believe we’re witnessing a shift in how tech companies – both platform owners and development studios – approach desktop and tablet software. Multiple factors – from better-looking displays and more powerful GPUs to cloud-based file management and subscriptions – are converging to make it possible to have a consistent app experience on every device you have without compromise. In this transition, iPad versions of desktop apps will be treated less like “companion” apps to a “real” desktop one and more like the same app, with the same features, optimized for touch and capable of adapting to the kind of computer it is running on (and adaptivity becomes especially important when you start considering external display output, for instance).

Photoshop, as Snell writes, is a first step. If Apple is truly pushing this vision forward, perhaps it’s time the company also started treating the iPad as a home for real pro apps, not just companion utilities to macOS apps.

Permalink