This week's sponsor

Remote for Mac

Turn your iPhone or iPad into a remote control for your Mac


Smart Speakers and Speech Impairment

Steven Aquino covers an important accessibility angle of smart speakers that I've never truly considered:

Since the HomePod started shipping last week, I’ve taken to Twitter on multiple occasions to (rightfully) rant about the inability of Siri—and its competitors—to parse non-fluent speech. By “non-fluent speech,” I’m mostly referring to stutterers because I am one, but it equally applies to others, such as deaf speakers.

This is a topic I’ve covered before. There has been much talk about Apple’s prospects in the smart speaker market; the consensus seems to be the company lags behind Amazon and Google because Alexa and Google Home are smarter than Siri. What is missing from these discussions and from reviews of these products is the accessibility of a HomePod or Echo or Sonos.

As I see it, this lack of consideration, whether intentional or not, overlooks a crucial part of a speaker product’s story. Smart speakers are a unique product, accessibility-wise, insofar as the voice-first interaction model presents an interesting set of conditions. You can accommodate for blindness and low vision with adjustable font sizes and screen readers. You can accommodate physical motor delays with switches. You can accommodate deafness and hard-of-hearing with closed captioning and using the camera’s flash for alerts.

But how do you accommodate for a speech impairment?

A human assistant would know how to deal with stuttering, dialects, or even just the need to repeat a part of a sentence you got wrong. None of the modern digital assistants currently goes beyond being a slightly humanized, voice-activated command line, and I wonder who will get there first.

Permalink

Unread Gains Inoreader Integration and Automatic Safari Reader Option

As John and I recently discussed on AppStories, there's never been a better time to use RSS. There are several powerful services to choose from (we both like Inoreader), and the selection of modern RSS clients for iOS is also solid these days.

Unread, now developed by Golden Hill Software, is one of the more elegant options based on comfortable gestures and a focused reading experience. I linked to the app in November when it received support for the iPhone X and iPad multitasking; in the post, I noted that I still wanted to see Unread gain Inoreader integration and haptic feedback on the latest iPhones.

With version 1.9 released today, Unread now fully supports Inoreader and plays subtle taps for different kinds of swipe gestures. Syncing subscriptions with Inoreader supports the full range of options available on the service: in addition to unread and starred articles, you can view your active searches, browse websites by folder, or open articles for individual subscriptions. Along with the aforementioned Taptic Engine integration, I also want to point out that Unread 1.9 supports one of the lesser-known Safari View Controller features on iOS: in the app's settings, you can now enable an option to load Safari View Controller in Reader mode by default when you tap any link in the app.

Unread isn't my default RSS client (I use Fiery Feeds 2 now), but I keep the app on my devices when I feel like I want to take a more relaxed approach to reading articles I've starred.

Unread 1.9 is available on the App Store.

Permalink

A Year-Long Experiment Comparing the Best Map Navigation Services

We all have our own anecdotal reasons for thinking a certain map navigation service is best, but few of us are willing to perform a committed experiment that gathers enough data to prove our beliefs. Artur Grabowski, however, did just that.

In an experiment that began early last year and led to recording 120 different driving trips, Grabowski compared the big three mapping services: Apple Maps, Google Maps, and Waze. Though more complex studies could certainly be done, Grabowski kept things simple by focusing on answering only three questions:

  1. Which navigation app estimates the shortest travel time?
  2. How does each app over/underestimate travel times?
  3. Which navigation app actually gets you to your destination most quickly?

His results found that Waze estimated the shortest travel times, but that actually wasn't a good thing, because the service also had the least accurate estimates. Apple Maps estimated the longest times, but that resulted in it being more accurate than its competitors. Google Maps, meanwhile, most often produced the fastest actual travel times, with Apple Maps and Waze placing second and third, respectively.

Grabowski's tests are accompanied by the asterisk that his routes were all taken in the San Francisco Bay Area, where Apple Maps is likely at its strongest. Even so, the data he compiled over the year is fascinating to analyze, and shows just how competitive these services are with each other in the areas that matter most.

Permalink

Tim Cook Reflects on Apple’s Success in Fast Company Interview

Earlier this week, Fast Company released its annual ranking of the most innovative companies of the year. Apple scored the top spot, moving up from its fourth-place ranking a year prior. In a follow-up piece, Robert Safian of Fast Company today published an exclusive interview with Tim Cook focusing on the company's success.

The whole interview's worth reading, but one segment of it stuck with me most. In response to a query regarding whether Cook views some years at Apple as better than others, the CEO replied that every year is a good year, because even if public launches aren't as exciting, there's always something big in the works behind the scenes.

Even when we were idling from a revenue point of view...those were some incredibly good years because you could begin to feel the pipeline getting better, and you could see it internally. Externally, people couldn’t see that. With the iPod, before it came out, we didn’t really know that it would become as big. But it was clear it was changing things in an incredibly good way. Of course with the iPhone it was clear that that was a huge change, a category definer, but who would’ve thought [it would have impact] to the degree that it [did].

Though the example isn't as extreme as the years leading up to launching the iPod or iPhone, one recent proof of what Cook's talking about is the contrast between Apple's 2016 and 2017. The former was viewed as a somewhat unexciting year by many of the company's closest followers. Major product launches included the iPhone SE, 9.7-inch iPad Pro, Apple Watch Series 2, iPhone 7, MacBook Pro with Touch Bar, and just barely squeaking into the calendar year, AirPods. It was a solid lineup to be sure, but many of the product updates felt more iterative than revolutionary, particularly when compared with the impressive year that followed.

In 2017 Apple introduced a low-budget iPad, new 10.5-inch and 12.9-inch iPad Pros, revisions across the entire MacBook lineup, the Apple Watch Series 3 with cellular, the Apple TV 4K, iPhone X and iPhone 8, and the iMac Pro, and took the veil off the HomePod. It was the sort of strong year, hardware-wise, that you simply can't have every year.

From inside the company, however, it's easier to view every year as a good one – because regardless of what the world at large sees, you're working to build the future.

Permalink

Alto’s Odyssey Review: Desert Tranquility

The sky above the desert has chosen a peculiar, almost plum-like shade of purple tonight as I’m nimbly moving past tall silhouettes of cacti and palm trees, when I see the black contours of a rock. My experience tells me that, in most cases, rocks have to be avoided, so I jump. While airborne, I glance at the dune ahead of me, and decide to attempt a backflip. The sky in the distance is a sight to behold – a full moon, barely visible among the clouds, faintly illuminates a panorama of ancient ruins left to age and crumble. In fact, the horizon is so beautiful, I don’t see another rock waiting just ahead of me as soon as I stick the landing. I hit the rock and fall face down in the sand. It’s game over.

I try again.

An angry lemur is chasing me, probably because I, once a mountain shepherd and now a tourist with a sandboard and little knowledge of lemur manners, woke the creature who was resting in his hut. His only goal, apparently, is to attack me and stop my speedy exploration. But I just need to make it to the next chasm and leave him behind, shaking his tiny lemur fist at me as I backflip over the void. That shouldn't be too hard.

He’s fast though. Suddenly, I see a potential way out: a rushing water stream connects to a narrow wall, which I can vertically ride to hop onto a vine where I can grind, jump, and backflip to build up speed and escape the lemur. Seems easy enough. My jumps are precise and I elegantly make it onto the vine. But the lemur isn’t giving up – he’s right behind me. 10, 9, 7, 5 meters behind – he’s going to catch me. But we’re at the end of the vine now, and if I jump, I’m going to land and sprint. I take the leap and start my backflip. I think I made it. Except the lemur also jumps, grabs me, and I’m face down in the sand again. It's game over.

I keep trying.

Read more


Things 3.4 Brings Powerful New Automation Features and App Integrations

I switched to Things as my task manager late last year, sometime before the holidays. While I discussed this decision at length on Connected and AppStories, I didn't write about it on MacStories because I didn't want to rehash Ryan's in-depth review of Things 3. In terms of design and features, everything I like about Things is covered in his original review.

My personal motivation for switching to Things boils down to this: the way projects and the Today screen are designed in Things fits well with my schedule, and doesn't stress me out. Things has a "calm" interface that doesn't guilt-trip me by turning overdue tasks red; projects are clearly laid out with a tasteful use of San Francisco, and further organization can be applied to a project using headings, a feature I've never seen in any other task manager. And among dozens of thoughtful touches, Things' separation of Today and This Evening for the current day is absolutely in line with how I think about my typical day. In short: I want 2018 to be less stressful than last year, and Things is helping with the task management part.

That said, as someone who used 2Do and Todoist in the past and heavily automated them for integration with other apps, I was missing some solid automation options from Things. Cultured Code has offered a basic URL scheme in their iOS app for a few years now, but the supported commands never went beyond the ability to create basic, metadata-free tasks in the app.

This is changing today with Things 3.4, which I've had the opportunity to test and experiment with for the past couple of months. With this new version, the folks at Cultured Code have shipped one of the most powerful and versatile URL scheme action libraries seen in a task manager for iOS – comparable to Omni's work on OmniFocus, and, in a way, perhaps even more flexible.

The new Things URL scheme, which has been documented here, lets you create tasks, projects, show specific sections of the app, and search across your entire Things database. This may not sound too impressive on paper, but what sets this feature apart is the level of customization and detail that can be applied to every single parameter of every action. As a result, Things is now a first-class citizen of the iOS automation scene, and, within the limitations of iOS inter-app communication, its URL scheme unlocks several new possible integrations with apps and workflows.
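To give a sense of how these actions compose, here's a minimal sketch (in Python, just for illustration) of building a Things URL. The command and parameter names (`add`, `title`, `notes`, `when`, `tags`) follow Cultured Code's published URL scheme documentation; any app that can open a URL on iOS could trigger the same action.

```python
from urllib.parse import quote, urlencode

def things_url(command, **params):
    """Build a things:/// URL for the given command.

    Parameter names follow Cultured Code's URL scheme documentation
    (e.g. title, notes, when, and tags for the `add` command).
    Underscores in Python keyword arguments become hyphens, since the
    scheme uses hyphenated parameter names.
    """
    query = urlencode(
        {key.replace("_", "-"): value for key, value in params.items()},
        quote_via=quote,  # percent-encode spaces as %20, not +
    )
    return f"things:///{command}?{query}"

# Create a task due today, with notes and a tag:
url = things_url("add", title="Write draft", notes="500 words", when="today", tags="Writing")
print(url)
```

Launching the resulting URL from, say, a Workflow action or another app's URL handler is what makes these integrations possible.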

Furthermore, Cultured Code wants to make it easy for third-party developers to natively support sending data to Things from their apps. Today, the company is also introducing a JSON-based command to allow more control when adding items to Things from external apps, and they've created a set of Swift helper classes that apps can use to easily generate the JSON needed to pass data to Things. As I'll demonstrate later in this article, the developers of two of my favorite iOS apps are already taking advantage of these capabilities to great effect.
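The advantage of the JSON command is that a single URL can describe nested structure, such as a project containing multiple to-dos. The sketch below (again in Python for illustration; the item shape with a `type` and an `attributes` dictionary is my reading of Cultured Code's JSON command documentation, which should be consulted for the full set of supported keys) shows the general idea:

```python
import json
from urllib.parse import quote

def things_json_url(items):
    """Encode a list of item dictionaries into a things:///json URL.

    Each item is a dictionary with a "type" (e.g. "to-do" or "project")
    and an "attributes" dictionary; nested items live under the
    project's "items" key. See Cultured Code's documentation for the
    authoritative format.
    """
    return "things:///json?data=" + quote(json.dumps(items))

# A project with two nested to-dos, created in a single call:
payload = [{
    "type": "project",
    "attributes": {
        "title": "Apartment Move",
        "items": [
            {"type": "to-do", "attributes": {"title": "Book movers"}},
            {"type": "to-do", "attributes": {"title": "Pack kitchen"}},
        ],
    },
}]
url = things_json_url(payload)
```

This is the kind of payload the Swift helper classes generate for developers automatically, so third-party apps don't have to assemble the JSON by hand.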

As you can imagine, I've been busy experimenting with the new automation features of Things and identifying aspects of the app I wanted to speed up by integrating them with other apps. Below, you'll find a collection of the launchers and workflows I've put together for Things 3.4. These are the actions and scripts I find myself using the most on a daily basis, and which I believe considerably extend Things' capabilities on the iPhone and iPad. More advanced workflows will follow over the next couple of weeks (and months) exclusively for Club MacStories members in the Workflow Corner section of MacStories Weekly.

Let's dig in.

Read more


Homecam for Apple TV Adds Room-Based Accessory Details and Controls

I first covered Aaron Pearce's Homecam app last week, noting how this is the app you need to have if you own multiple HomeKit cameras and want a dashboard to monitor them all at once.

I wrote:

Homecam displays a live grid of all the HomeKit-enabled cameras in your house. That's it. Instead of having to look for cameras in specific rooms or at the bottom of the main accessories list in Apple's Home app, you can launch HomeCam and see them all at once. Then, you can tap on a camera to watch live footage in full-screen, and optionally add a camera to the Today widget as well. Soon, you'll also be able to control accessories in the room where the camera is located.

Homecam's widget is particularly impressive as it can also display live footage from each camera without having to load the main app – and it works both over WiFi and with remote access on cellular connections. I can't recommend Homecam enough if you own multiple HomeKit cameras and have been looking for a quick way to switch between them.

While my story focused on iOS, Homecam is also available on the Apple TV, where it takes advantage of the big screen to let you keep an eye on even bigger real-time video feeds from HomeKit cameras around the house. And today, Homecam for tvOS received a substantial update with a feature I hope Pearce will soon bring to the iOS version as well: room-based accessory details and controls.

In the new Homecam for Apple TV, you can long-press on the Siri Remote to bring up a menu that lists controls for sensors, switches, and lights located in the same room as the camera. This allows you to, say, monitor your sleeping baby and progressively dim the lights, or remotely check on the state of any appliance connected to a smart plug. In addition, Homecam can now pull in details from sensors (for temperature, humidity, and air quality) and overlay them on the full-screen video, so you can see what the temperature is, turn the fan off, and visually confirm that it worked.

I've been testing this Homecam update for the past couple of days, and, while I continue to fundamentally dislike the Siri Remote's fiddly touchpad, it was fun to watch the dogs sleep on the sofa and wake them up by changing the color of the lights or turning the TV on and off (yes, we did apologize to them with treats afterwards). Pairing a live video feed from a HomeKit camera with data layers and controls from the same room makes perfect sense, and I hope this feature makes it to the iOS app soon.

Homecam for iOS and tvOS is available on the App Store at $4.99.

Permalink

Motherboard Offers Inside Look at iFixit’s iPhone X Journey

After several years of inquiries, Jason Koebler and the team at Motherboard were granted permission last November to follow iFixit on its journey of tearing down the latest iPhone. Video of that job, which included a plane ride from California to Australia for the iPhone X's launch, is now available.

iFixit is best known for its commitment to tearing down new tech as soon as it becomes available, but as the video documents, the company's primary mission is to equip people to repair their devices. Whether you've followed iFixit's work in the past or not, the video is a fascinating look into the efforts that fuel each device teardown.

Permalink