Federico Viticci

9048 posts on MacStories since April 2009

Federico is the founder and Editor-in-Chief of MacStories, where he writes about Apple with a focus on apps, developers, and iOS productivity. He founded MacStories in April 2009 and has been writing about Apple ever since. Federico is also the co-host of AppStories, a weekly podcast exploring the world of apps.

He can also be found on his three other podcasts on Relay FM – Connected, Canvas, and Remaster.


The Evolution of Alto’s Odyssey

iMore's Serenity Caldwell and Luke Filipowicz put together a great Q&A with Team Alto to discuss the making of Alto's Odyssey.

Here's my favorite bit, which only adds to my love for this game:

One of the big breaks in the new game's theming came from the lives of Team Alto's members themselves. "Things really clicked when we reflected as a group on how much our lives had changed since the release of Alto's Adventure," wrote Cymet. "The team had grown, some of us had moved away from the homes we knew to live in other places, and we had all experienced big personal upheaval in different ways.

"What we arrived at was a desire to capture the feeling of going outside your comfort zone, exploring the unfamiliar, and accepting that the concept of 'home' is related to the people close to you, not any one place. In many ways, this is what led us to the grandeur of Alto's Odyssey's setting. This idea of a fantastical place far from what you know as home, where you learn to see the beauty in embracing the unknown."

Permalink

Connected, Episode 181: Headspaced Against My Will

Federico has a surprise for the group, then goes on to teach about Things 3 before Stephen complains about its repeating task support. In a shocking turn of events, Myke has the need for a Mac app.

I had a lot of fun on this week's episode of Connected. You don't want to miss the surprise I'd been keeping from the group. You can listen here.

Sponsored by:

  • Ting: A smarter way to do mobile. Save $25 on selected devices or keep it as Ting credit.
  • Pingdom: Start monitoring your websites and servers today. Use offer code CONNECTED to get 30% off.
  • Squarespace: Make your next move. Enter offer code WORLD at checkout to get 10% off your first purchase.

Permalink

Smart Speakers and Speech Impairment

Steven Aquino covers an important accessibility angle of smart speakers that I've never truly considered:

Since the HomePod started shipping last week, I’ve taken to Twitter on multiple occasions to (rightfully) rant about the inability of Siri—and its competitors—to parse non-fluent speech. By “non-fluent speech,” I’m mostly referring to stutterers because I am one, but it equally applies to others, such as deaf speakers.

This is a topic I’ve covered before. There has been much talk about Apple’s prospects in the smart speaker market; the consensus seems to be the company lags behind Amazon and Google because Alexa and Google Home are smarter than Siri. What is missing from these discussions and from reviews of these products is the accessibility of a HomePod or Echo or Sonos.

As I see it, this lack of consideration, whether intentional or not, overlooks a crucial part of a speaker product’s story. Smart speakers are a unique product, accessibility-wise, insofar as the voice-first interaction model presents an interesting set of conditions. You can accommodate for blindness and low vision with adjustable font sizes and screen readers. You can accommodate physical motor delays with switches. You can accommodate deafness and hard-of-hearing with closed captioning and using the camera’s flash for alerts.

But how do you accommodate for a speech impairment?

A human assistant would know how to deal with stuttering, dialects, or even just the need to repeat a part of a sentence you got wrong. None of the modern digital assistants currently goes beyond being a slightly humanized command line activated by voice, and I wonder who will get there first.

Permalink

Unread Gains Inoreader Integration and Automatic Safari Reader Option

As John and I recently discussed on AppStories, there's never been a better time to use RSS. There are several powerful services to choose from (we both like Inoreader), and the selection of modern RSS clients for iOS is also solid these days.

Unread, now developed by Golden Hill Software, is one of the more elegant options, built around comfortable gestures and a focused reading experience. I linked to the app in November when it received support for the iPhone X and iPad multitasking; in the post, I noted that I still wanted to see Unread gain Inoreader integration and haptic feedback on the latest iPhones.

With version 1.9 released today, Unread now fully supports Inoreader and plays subtle taps for different kinds of swipe gestures. Syncing subscriptions with Inoreader supports the full range of options available on the service: in addition to unread and starred articles, you can view your active searches, browse websites by folder, or open articles for individual subscriptions. Along with the aforementioned Taptic Engine integration, I also want to point out that Unread 1.9 supports one of the lesser-known Safari View Controller features on iOS: in the app's settings, you can now enable an option to load Safari View Controller in Reader mode by default when you tap any link in the app.

Unread isn't my default RSS client (I use Fiery Feeds 2 now), but I keep the app on my devices for when I want to take a more relaxed approach to reading articles I've starred.

Unread 1.9 is available on the App Store.

Permalink

Alto’s Odyssey Review: Desert Tranquility

The sky above the desert has chosen a peculiar, almost plum-like shade of purple tonight as I’m nimbly moving past tall silhouettes of cacti and palm trees, when I see the black contours of a rock. My experience tells me that, in most cases, rocks have to be avoided, so I jump. While airborne, I glance at the dune ahead of me, and decide to attempt a backflip. The sky in the distance is a sight to behold – a full moon, barely visible among the clouds, faintly illuminates a panorama of ancient ruins left to age and crumble. In fact, the horizon is so beautiful, I don’t see another rock waiting just ahead of me as soon as I stick the landing. I hit the rock and fall face down in the sand. It’s game over.

I try again.

An angry lemur is chasing me, probably because I, once a mountain shepherd and now a tourist with a sandboard and little knowledge of lemur manners, woke the creature who was resting in his hut. His only goal, apparently, is to attack me and stop my speedy exploration. But I just need to make it to the next chasm and leave him behind, shaking his tiny lemur fist at me as I backflip over the void. That shouldn't be too hard.

He’s fast though. Suddenly, I see a potential way out: a rushing water stream connects to a narrow wall, which I can vertically ride to hop onto a vine where I can grind, jump, and backflip to build up speed and escape the lemur. Seems easy enough. My jumps are precise and I elegantly make it onto the vine. But the lemur isn’t giving up – he’s right behind me. 10, 9, 7, 5 meters behind – he’s going to catch me. But we’re at the end of the vine now, and if I jump, I’m going to land and sprint. I take the leap and start my backflip. I think I made it. Except the lemur also jumps, grabs me, and I’m face down in the sand again. It's game over.

I keep trying.

Read more


Things 3.4 Brings Powerful New Automation Features and App Integrations

I switched to Things as my task manager late last year, sometime before the holidays. While I discussed this decision at length on Connected and AppStories, I didn't write about it on MacStories because I didn't want to rehash Ryan's in-depth review of Things 3. In terms of design and features, everything I like about Things is covered in his original review.

My personal motivation for switching to Things boils down to this: the way projects and the Today screen are designed in Things fits well with my schedule, and doesn't stress me out. Things has a "calm" interface that doesn't turn overdue tasks red to make me feel guilty; projects are clearly laid out with a tasteful use of San Francisco, and further organization can be applied to a project using headings, a feature I've never seen in any other task manager. And among dozens of thoughtful touches, Things' separation of Today and This Evening for the current day is absolutely in line with how I think about my typical day. In short: I want 2018 to be less stressful than last year, and Things is helping with the task management part.

That said, as someone who used 2Do and Todoist in the past and heavily automated them for integration with other apps, I was missing some solid automation options from Things. Cultured Code has offered a basic URL scheme in their iOS app for a few years now, but the supported commands never went beyond the ability to create basic, metadata-free tasks in the app.

This is changing today with Things 3.4, which I've had the opportunity to test and experiment with for the past couple of months. With this new version, the folks at Cultured Code have shipped one of the most powerful and versatile URL scheme action libraries seen in a task manager for iOS – comparable to Omni's work on OmniFocus, and, in a way, perhaps even more flexible.

The new Things URL scheme, which has been documented here, lets you create tasks and projects, show specific sections of the app, and search across your entire Things database. This may not sound too impressive on paper, but what sets this feature apart is the level of customization and detail that can be applied to every single parameter of every action. As a result, Things is now a first-class citizen of the iOS automation scene, and, within the limitations of iOS inter-app communication, its URL scheme unlocks several new possible integrations with apps and workflows.
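To give a sense of what these actions look like, here's a minimal sketch of how a launcher app or script could assemble a task-creation URL. The parameter names (`title`, `notes`, `when`, `tags`) follow Cultured Code's published URL scheme documentation, but consult the docs for the full list of supported parameters; the helper function name is my own.

```python
from urllib.parse import urlencode, quote

def things_add_url(title, notes=None, when=None, tags=None):
    """Build a things:///add URL for creating a task in Things.

    Parameter names follow Cultured Code's URL scheme documentation;
    this covers only a small subset of the documented options.
    """
    params = {"title": title}
    if notes:
        params["notes"] = notes
    if when:
        params["when"] = when  # e.g. "today", "tomorrow", or a date
    if tags:
        params["tags"] = ",".join(tags)  # comma-separated tag names
    # quote_via=quote percent-encodes spaces as %20 instead of "+"
    return "things:///add?" + urlencode(params, quote_via=quote)

url = things_add_url("Buy dog treats", when="today", tags=["Home", "Errands"])
print(url)
# things:///add?title=Buy%20dog%20treats&when=today&tags=Home%2CErrands
```

A URL like this can then be launched from Workflow, Launch Center Pro, or any app capable of opening URLs, which is what makes the scheme so composable.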

Furthermore, Cultured Code wants to make it easy for third-party developers to natively support sending data to Things from their apps. Today, the company is also introducing a JSON-based command to allow more control when adding items to Things from external apps, and they've created a set of Swift helper classes that apps can use to easily generate the JSON needed to pass data to Things. As I'll demonstrate later in this article, the developers of two of my favorite iOS apps are already taking advantage of these capabilities to great effect.
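The advantage of the JSON command over plain URL parameters is that it can describe nested structures, like a project with its to-dos, in a single call. Here's a rough sketch of the idea; the exact payload schema (the `type`/`attributes` shape shown below) is an assumption based on Cultured Code's documentation, so treat it as illustrative rather than authoritative.

```python
import json
from urllib.parse import quote

# Assumed payload shape: a JSON array of items, each with a "type" and
# an "attributes" dictionary -- see Cultured Code's documentation for
# the authoritative schema.
items = [
    {
        "type": "project",
        "attributes": {
            "title": "Things 3.4 Review",
            "items": [
                {"type": "to-do", "attributes": {"title": "Outline sections"}},
                {"type": "to-do", "attributes": {"title": "Test URL scheme actions"}},
            ],
        },
    }
]

# The JSON is serialized and percent-encoded into a single URL parameter.
url = "things:///json?data=" + quote(json.dumps(items))
print(url)
```

On iOS, the Swift helper classes Cultured Code provides would generate this kind of payload for you, so third-party apps don't have to build the JSON by hand.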

As you can imagine, I've been busy experimenting with the new automation features of Things and identifying aspects of the app I wanted to speed up by integrating them with other apps. Below, you'll find a collection of the launchers and workflows I've put together for Things 3.4. These are the actions and scripts I find myself using the most on a daily basis, and which I believe considerably extend Things' capabilities on the iPhone and iPad. More advanced workflows will follow over the next couple of weeks (and months) exclusively for Club MacStories members in the Workflow Corner section of MacStories Weekly.

Let's dig in.

Read more


Homecam for Apple TV Adds Room-Based Accessory Details and Controls

I first covered Aaron Pearce's Homecam app last week, noting how this is the app you need to have if you own multiple HomeKit cameras and want a dashboard to monitor them all at once.

I wrote:

Homecam displays a live grid of all the HomeKit-enabled cameras in your house. That's it. Instead of having to look for cameras in specific rooms or at the bottom of the main accessories list in Apple's Home app, you can launch Homecam and see them all at once. Then, you can tap on a camera to watch live footage in full-screen, and optionally add a camera to the Today widget as well. Soon, you'll also be able to control accessories in the room where the camera is located.

Homecam's widget is particularly impressive as it can also display live footage from each camera without having to load the main app – and it works both over WiFi and with remote access on cellular connections. I can't recommend Homecam enough if you own multiple HomeKit cameras and have been looking for a quick way to switch between them.

While my story focused on iOS, Homecam is also available on the Apple TV, where it takes advantage of the big screen to let you keep an eye on even bigger real-time video feeds from HomeKit cameras around the house. And today, Homecam for tvOS received a substantial update with a feature I hope Pearce will soon bring to the iOS version as well: room-based accessory details and controls.

In the new Homecam for Apple TV, you can long-press on the Siri Remote to bring up a menu that lists controls for sensors, switches, and lights located in the same room as the camera. This allows you to, say, monitor your sleeping baby and progressively dim the lights, or remotely check on the state of any appliance connected to a smart plug. In addition, Homecam can now pull in details from sensors (for temperature, humidity, and air quality) and overlay them on the full-screen video, so you can see what the temperature is, turn the fan off, and visually confirm that it worked.

I've been testing this Homecam update for the past couple of days, and, while I continue to fundamentally dislike the Siri Remote's fiddly touchpad, it was fun to watch the dogs sleep on the sofa and wake them up by changing the color of the lights or turning the TV on and off (yes, we did apologize to them with treats afterwards). Pairing a live video feed from a HomeKit camera with data layers and controls from the same room makes perfect sense, and I hope this feature makes it to the iOS app soon.

Homecam for iOS and tvOS is available on the App Store at $4.99.

Permalink

The Reliable Simplicity of AirPods

Chris Welch, writing for The Verge on AirPods' advantage over other wireless earbuds:

AirPods are the best truly wireless earbuds available because they nail the essentials like ease of use, reliability, and battery life. There are alternatives that definitely sound better from Bose, B&O Play, and others. But they often cost more and all of them experience occasional audio dropouts. AirPods don’t. I’d argue they’re maybe the best first-gen product Apple has ever made. Unfortunately, I’m one of the sad souls whose ears just aren’t a match for the AirPods — and I’m a nerd who likes having both an iPhone and Android phone around — so I’ve been searching for the best non-Apple option.

But some 14 months after AirPods shipped, there’s still no clear-cut competitor that’s truly better at the important stuff. They all lack the magic sauce that is Apple’s W1 chip, which improves pairing, range, and battery life for the AirPods. At this point I think it’s fair to say that Bluetooth alone isn’t enough to make these gadgets work smoothly. Hopefully the connection will be more sturdy once more earbuds with Bluetooth 5 hit the market. And Qualcomm is also putting in work to help improve reliability.

I haven't tested all the wireless earbuds Welch has, but I have some anecdotal experience here.

A few months ago, I bought the B&O E8 earbuds on Amazon. After getting a 4K HDR TV for Black Friday (the 55-inch LG B7), I realized that I wanted to be able to watch a movie or play videogames while lying in bed without having to put bulky over-ear Bluetooth headphones on. Essentially, I wanted AirPods for my TV, but I didn't want to use the AirPods that were already paired with my iPhone and iPad. I wanted something that I could take out of the case, put on, and be done with. So instead of getting a second pair of AirPods, I decided to try the E8.

I like the way the E8 sound and I'm a fan of the Comply foam tips. The case is elegant (though not as intuitive as the AirPods' case), but the gestures can be confusing. My problem is that, despite sitting 3 meters away from the TV, one of the earbuds constantly drops out. I sometimes have to sit perfectly still to ensure the audio doesn't cut out – quite often, even turning my head causes the audio to drop out in one of the E8. I'm still going to use these because I like the freedom granted by a truly wireless experience and because I've found the ideal position that doesn't cause audio issues, but I'm not a happy customer. Also, it's too late to return them now.

A couple of days ago, I was doing chores around the house. I usually listen to podcasts with my AirPods on if it's early and my girlfriend is still sleeping, which means I leave my iPhone in the kitchen and move around wearing AirPods. At one point, I needed to check out something outside (we have a very spacious terrace – large enough for the dogs to run around) and I just walked out while listening to a podcast.

A couple of minutes later, the audio started cutting out. My first thought was that something in Overcast was broken. It took me a solid minute to realize that I had walked too far away from the iPhone inside the house. I'm so used to the incredible reliability and simplicity of my AirPods, it didn't even occur to me that I shouldn't have left my iPhone 15 meters and two rooms away.


New iOS Apps Must Be Built with iOS 11 SDK and Support iPhone X Display Starting in April

Juli Clover, writing for MacRumors:

Apple today sent out a notice to developers letting them know that starting in April of 2018, all new apps submitted to the App Store must be built using the iOS 11 SDK, which is included in Xcode 9 or later.

Furthermore, Apple says that all new apps designed for the iPhone, including universal apps, must support the iPhone X's Super Retina display.

"Must be built", unfortunately, doesn't mean apps have to support new features like drag and drop. Speaking of which, I don't think supporting the native resolution of the 12.9-inch iPad Pro is a requirement yet, and the device launched in November 2015.

Permalink