In my Future of Workflow article from last year (published soon after the news of Apple’s acquisition), I outlined some of the probable outcomes for the app. The more optimistic one – the “best timeline”, so to speak – envisioned an updated Workflow app as a native iOS automation layer, deeply integrated with the system and its built-in frameworks. After studying Apple’s announcements at WWDC, talking to developers at the conference, and hearing other details about Shortcuts firsthand, it appears that the brightest scenario is indeed coming true in a matter of months.
On the surface, Shortcuts the app looks like the full-blown Workflow replacement heavy users of the app have been wishfully imagining for the past year. But there is more going on with Shortcuts than the app alone. Shortcuts the feature, in fact, reveals a fascinating twofold strategy: on one hand, Apple hopes to accelerate third-party Siri integrations by leveraging existing APIs as well as enabling the creation of custom SiriKit Intents; on the other, the company is advancing a new vision of automation through the lens of Siri and proactive assistance from which everyone – not just power users – can reap the benefits.
While it’s still too early to comment on the long-term impact of Shortcuts, I can at least attempt to understand the potential of this new technology. In this article, I’ll try to explain the differences between Siri shortcuts and the Shortcuts app, as well as answer some common questions about how much Shortcuts borrows from the original Workflow app. Let’s dig in.
App Shortcuts and Suggestions
There’s an important difference between shortcuts and the Shortcuts app. As a system functionality, a shortcut is a convenient way to reopen or interact with a key feature of an app that the user has previously seen or completed. For instance, in iOS 12 you may see shortcuts for ordering a coffee on your way to work or playing a podcast playlist in a third-party app as you’re driving back home. App shortcuts are not workflows; they are the equivalent of a “point of interest” in an app that you can easily find again.
In Apple’s parlance, shortcuts are “donated” by apps to the system after a user performs an action in an app. Then, iOS 12 suggests shortcuts in Spotlight search results (where they appear as tappable items featuring the app’s icon and labeled action) and on the Lock screen as notification-like bubbles. In watchOS 5, you’ll see suggested shortcuts on the Siri watch face.
Shortcut suggestions are generated on-device by taking into account contextual triggers such as the time of day and day of week, the user’s location, and detected motion (such as walking or driving). Apple has been developing and refining their so-called Proactive technologies for years now[1], but the Shortcuts project marks the company’s first foray into deep-linked app actions that react to user patterns and environmental triggers.
There are two types of app shortcuts. The first kind is a basic shortcut that opens a specific piece of content or section in an app. These simpler shortcuts are based on NSUserActivity, an API that Apple first introduced in iOS 8 to enable third-party apps to hand off activity to other devices and later expanded in iOS 9 to offer search results in Spotlight. The same API, with minimal tweaks on the developers’ side, is used in iOS 12 to provide users with shortcuts that launch apps into specific screens or activities. I expect developer adoption of shortcuts based on NSUserActivity to be massive when iOS 12 launches later this year; especially for apps that do not require execution of tasks inline within Siri/Spotlight/Lock screen, NSUserActivity should be enough.
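To make the NSUserActivity path concrete, here’s a minimal sketch of how an app might donate an activity-based shortcut in iOS 12. The view controller, activity type string, title, and phrase are all hypothetical; the APIs (`isEligibleForPrediction`, `suggestedInvocationPhrase`, `persistentIdentifier`) are the iOS 12 additions Apple described at WWDC:

```swift
import UIKit
import Intents

// Hypothetical "order history" screen in a coffee-ordering app.
final class OrderHistoryViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Describe this screen as an activity the system can deep-link back into.
        let activity = NSUserActivity(activityType: "com.example.coffee.order-history")
        activity.title = "View Order History"
        activity.isEligibleForSearch = true
        // New in iOS 12: opt the activity into Siri/Spotlight shortcut predictions.
        activity.isEligibleForPrediction = true
        activity.suggestedInvocationPhrase = "Show my coffee orders"
        // Lets the app delete this donation later if the screen goes away.
        activity.persistentIdentifier = "order-history"

        // Assigning the view controller's current activity "donates" it to the system.
        userActivity = activity
    }
}
```

Note that the activity type would also need to be declared under `NSUserActivityTypes` in the app’s Info.plist, and the app restores state in `application(_:continue:restorationHandler:)` when the shortcut is tapped.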
According to Apple, while NSUserActivity enables basic integrations with apps, SiriKit Intents provide the best experience for users who want to accomplish more with shortcuts. This is not a new API either, but the way it’s surfaced throughout the system is.
SiriKit Intents have been around since iOS 10 as a way for developers to integrate their apps natively with the Siri voice experience and perform actions inline within the assistant. Apple launched SiriKit as a domain-based API designed for specific kinds of apps, and slightly improved it last year with the addition of visual codes, lists, and note-taking apps. Just as it was becoming clear that Apple’s narrow domain-specific approach couldn’t scale to thousands of apps that can’t be easily categorized, the company is turning SiriKit on its head.
In iOS 12, developers can now create their own custom intents based on built-in semantic templates; furthermore, existing SiriKit Intents can break out of the Siri UI and also work as shortcuts in other places such as Spotlight, the Lock screen, and even the Siri watch face. Apple’s approach isn’t surprising: if iOS apps can have the ability to perform tasks with custom interfaces and responses outside of the main app environment (as is currently possible with SiriKit Intents), why not expand the same functionality to other types of proactive assistance? With shortcuts, any essential, repeatable feature of an app can become an action that can be executed from anywhere on the system without launching the full app.
The idea of frequent usage and user routine is what separates intent-based shortcuts from traditional SiriKit voice interactions. For example, iOS 12 may learn that, on the way back home from work on Thursdays, you like to order pizza and have it delivered at 8 PM. Or that on an average workday around 1 PM, you open Things into your ‘Office’ project. These are repeatable actions that developers can turn into shortcuts with custom interfaces[2] using the same underlying Intents technology first launched in iOS 10.
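Intent-based donations follow the same pattern as activity-based ones, except they go through `INInteraction`. This is a hedged sketch using the built-in `INSendMessageIntent` as a stand-in; the recipient details and invocation phrase are made up, but `INInteraction(intent:response:).donate(completion:)` is the real donation call:

```swift
import Intents

// Donate an intent every time the user completes the action, so iOS 12
// can learn the pattern and suggest it as a shortcut later.
func donateSendMessageShortcut(to recipientName: String) {
    let recipient = INPerson(
        personHandle: INPersonHandle(value: "user@example.com", type: .emailAddress),
        nameComponents: nil,
        displayName: recipientName,
        image: nil,
        contactIdentifier: nil,
        customIdentifier: nil
    )
    let intent = INSendMessageIntent(
        recipients: [recipient],
        content: nil,
        speakableGroupName: nil,
        conversationIdentifier: nil,
        serviceName: nil,
        sender: nil
    )
    // Developers may suggest a phrase; the user still records their own.
    intent.suggestedInvocationPhrase = "Message \(recipientName)"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```

The key design point is that donation happens at the moment the user performs the action in the app, not ahead of time – which is how the system builds up the usage patterns that drive suggestions.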
Developers who are planning to integrate with SiriKit in iOS 12 will have to consider whether users may want to execute actions from their apps elsewhere on the system; those who have shied away from integrating with SiriKit so far should probably look into custom intents now.
This new feature allows any app to offer a custom interface and custom responses that are used when the intent is invoked via Siri or shortcuts. To create a custom intent, developers can choose from building blocks that include verbs such as “do” or “order” and other templates; these actions essentially define how Siri talks about the task it’s executing. I’m excited about the prospect of any app becoming eligible for Siri integration; going forward, I expect Apple to continue expanding its custom intent technology as it may open up Siri to hundreds of thousands of new app integrations.
Even though the opposite may seem true, the shortcut features I’ve described so far do constitute a form of automation. Arguably, suggested shortcuts are system automations – actions to trigger a specific function that are conveniently presented at the best available time or location to anticipate users’ needs. Some old-school power users may disagree with me on this, but, more broadly, I consider Apple’s Proactive technologies – whether in surfacing a calendar event in an email message or ordering coffee while driving to work – a new kind of automation. Only time and developer adoption will tell if Apple’s bet is successful; conceptually speaking, I see suggested shortcuts as an effortless, almost invisible way to get users accustomed to the idea of actions that are automatically surfaced by the OS.
The line between system and user automation gets blurry once we start considering the second layer of Apple’s Shortcuts initiative: the ability for users to create custom phrases to launch shortcuts.
Add to Siri
Available in Settings ⇾ Siri & Search, iOS 12 features an option for users to define their own phrases for launching specific shortcuts via voice. This is done by speaking a custom phrase into a Siri recording UI that transcribes the command and creates a shortcut that can be invoked at any time. The Settings app automatically suggests recently used app shortcuts as well as other shortcuts that were previously “donated” by apps. Both recording a custom shortcut phrase and launching the phrase via Siri require an active Internet connection. Once given a custom phrase, user-configured shortcuts appear under the My Shortcuts section in Settings.
There are a few details worth noting about adding custom shortcut phrases to Siri. In their apps, third-party developers can embed messages and buttons (which they can design) to bring up the Siri UI to record a shortcut phrase. This means we’ll start seeing apps populate important screens or actions with suggestions and buttons to record a shortcut phrase. Moreover, in the Siri recording UI, developers can include a phrase suggestion, but it’s up to the user to decide what they want to record.
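The in-app recording flow described above maps to Apple’s new IntentsUI additions. The sketch below shows the Add to Siri button wired to a hypothetical activity-based shortcut; `INUIAddVoiceShortcutButton` and `INShortcut` are the iOS 12 APIs, while the activity type and suggested phrase are placeholders:

```swift
import IntentsUI

// Sketch: building Apple's "Add to Siri" button for an
// NSUserActivity-based shortcut (hypothetical activity type).
func makeAddToSiriButton() -> INUIAddVoiceShortcutButton {
    let activity = NSUserActivity(activityType: "com.example.coffee.order-history")
    activity.title = "View Order History"
    // The developer can suggest a phrase, but the user records their own.
    activity.suggestedInvocationPhrase = "Show my coffee orders"

    let button = INUIAddVoiceShortcutButton(style: .black)
    button.shortcut = INShortcut(userActivity: activity)
    return button
}
```

Tapping the button brings up the system recording UI (via the button’s `INUIAddVoiceShortcutButtonDelegate`), which is where the user’s custom phrase is captured and associated with the shortcut.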
More importantly, users always have to create personalized shortcut phrases through direct interaction: apps cannot automatically fill the ‘My Shortcuts’ page in Settings with shortcuts and custom phrases. The user has to associate a custom phrase with a shortcut first.
The more I think about it, the more I see custom shortcut phrases as the next big step in making Siri a more personal assistant that is unique to each user. As would happen with an actual assistant, shortcut phrases allow users to form their own language over time, creating a personalized set of instructions that only their assistant can interpret and act upon. It’s the equivalent of jargon in a group of friends, but applied to Siri and app actions. The potential accessibility perks are tremendous too: Apple now enables everyone to create custom Siri phrases that can be as long or as short as they want; this removes the need to find actions nested in apps, multiple levels deep into their navigation stack.
Here’s why I believe Apple and the Workflow (now Shortcuts) team have been incredibly smart in reframing the concept of user automation around Siri and voice: when you think about it, custom phrases aren’t too dissimilar from keyboard shortcuts. However, spoken phrases are easier to remember – they don’t feel like dark magic to regular users who have never bothered with “automation” before, and, most of all, they are natively supported across the entire spectrum of Apple products, from iPhones and AirPods to HomePods and Watches.[3]
I strongly believe that personalized phrases are the first step towards changing the fundamental Siri experience, which is going to evolve into a personal command log – from one Siri to a million Siris, each uniquely tailored to the user who customized it. Furthermore, custom phrases reveal the third (and, for now, final) layer of Apple’s automation and Siri personalization strategy: the brand new Shortcuts app.
Custom Shortcuts and the Shortcuts App
The Shortcuts app, announced at WWDC last week, is the new version of Workflow. The app will not be part of iOS 12 itself; instead, it’ll be available on the App Store. In conversations I had last week, it appears that Apple’s goal is to offer full compatibility with existing workflows previously created in the Workflow app. My understanding is that Apple is very much aware of the fact that a sizable portion of the pro/prosumer community relies on Workflow to enhance their iOS experience in key ways; they don’t want to change that relationship for the worse. Little, if anything, should break in the transition to the Shortcuts app; in fact, I’m optimistic about the prospect of retaining all the actions from the original Workflow plus new ones created specifically for Shortcuts.
At first glance, Shortcuts looks like a cleaner, more intuitive version of Workflow designed for the modern iOS 12 design language. The app lets you create workflows – now referred to as “custom shortcuts” – with an editor that, just like the Workflow app, supports drag and drop to move actions from a library (organized in content types) into the shortcut editor. In the Shortcuts app, Apple removed the swipe gesture to navigate between the action library and editor in favor of a bottom panel that is reminiscent of Apple Maps and Stocks in iOS 12. A search field is always available at the bottom of the editor; tap on it, and you’ll be able to view all the actions Shortcuts offers. Abundant design refinements and the new action panel aside, the editor’s core structure looks just like Workflow’s.
Despite the close resemblance, Shortcuts isn’t just a redesigned version of Workflow. And it’s also more than a glorified utility for people who want to geek out on an iOS device. As the “third layer” after suggested shortcuts and custom phrases, custom shortcuts are the most versatile tool for every iOS user who wants to deeply personalize Siri, automate sequences of actions, and, yes, even augment their productivity. The Shortcuts app aims to be a powerful blend of Workflow, app shortcuts, and Siri all rolled into one as a new take on personal assistants and iOS automation. It’s a bold idea that keeps what made Workflow unique while also opening it up to a broader user base and deeper system integrations.
There are a few key elements to consider. First, app shortcuts – the aforementioned actions donated by developers with NSUserActivity or SiriKit Intents – can be part of a custom shortcut created in the Shortcuts app. These shortcuts[4] should either appear under Siri Suggestions or Apps in the action library. The inclusion of these actions in the Shortcuts app is a big deal: for the first time, users can create chains of actions that execute native third-party app commands without launching apps through URL schemes. Whether you want to launch an activity in an app or perform an action, these native actions won’t require you to write any code or talk to any web API – and in the case of intent-based actions, they will run inline within the Shortcuts app itself.
In a way, this is the first step toward the WorkflowKit framework I imagined last year – a solution for apps to be supported in an automation environment without the limitations and security concerns of URL schemes. What I couldn’t foresee was that Apple would reuse SiriKit for this. I don’t think these new integrations will obviate the need for more customizable URL scheme actions just yet (more on this below), but it’s a move in the right direction.
Perhaps more impressively, it seems that, upon assembling a custom shortcut, users will be able to choose to display native third-party app actions or not with a new ‘Show When Run’ toggle. If I were to guess, I’d say that this setting applies both to shortcuts being run in the Shortcuts app as well as the execution of multiple steps in Siri.
Which brings me to the second notable trait of the Shortcuts app: custom phrases. Just like shortcuts provided by apps to the system can be assigned a custom Siri invocation phrase, custom shortcuts from the Shortcuts app can be assigned a phrase to quickly trigger them from Siri.[5] This ties into another key functionality of custom shortcuts: whenever possible, Siri on iOS 12 will try to run the multiple steps that comprise a custom shortcut inline, without launching the Shortcuts app; it’ll do so simply by moving down the sequence of actions and confirming results to the user.
This was demonstrated by Apple’s Kimberly Beverett at last week’s keynote: with a custom shortcut, Siri was able to send an iMessage to a contact, set a HomeKit scene, play audio in a third-party radio app, and open directions in Maps – all in a single request that ran contextually inside Siri. The demo showcased two powerful aspects of Shortcuts’ integration with Siri: background execution and the ability to skip confirmation steps in a series of automated actions. Just like in a traditional “workflow”, Siri completed multiple actions in a row, displayed a summary message, and only launched an app at the very end. I cannot even imagine all the advanced custom shortcuts I could build by mixing background execution with web APIs[6], native app shortcuts, and Siri in the same sequence of actions.
Execution of multiple background tasks inside Siri feels to me like the final piece of the Workflow acquisition puzzle. If you consider all the layers I covered above – app shortcuts, custom phrases, and custom shortcuts – it’s clear that Apple wants to transform Siri into an assistant capable of not only handling a variety of app-related requests but, more importantly, bundles of sequential requests that are routinely requested by users. Of all the Shortcuts features I’ve seen so far, running whole sequences of steps in Siri is the one I’m most curious about.
More Shortcuts Details
Below is a collection of details about shortcuts and the Shortcuts app that I put together by rewatching the WWDC videos and through many interesting conversations last week.
Magic Variables and Scripting actions are in. Two of Workflow’s most advanced functionalities will continue to be supported in Shortcuts and will likely play an essential role in the creation of complex chains of actions.
Magic Variables, introduced in Workflow 1.7, enable actions to continuously create variables behind the scenes, letting users dynamically convert them between types using the Content Graph engine. Magic Variables are unique to Workflow and they have dramatically changed how advanced users can chain actions together. Judging from Apple’s screenshots of the Shortcuts app, Magic Variables will be supported in custom shortcuts and users will also be able to include them as part of a response read aloud by Siri.
I’m also glad to see the inclusion of the Scripting category of actions in Shortcuts. Currently, the Scripting section of Workflow features options such as conditional blocks, repeat loops, getting data types and setting file names, and even fetching device details like battery and network information. All signs are pointing to Shortcuts retaining the same functionalities.
Shortcuts has an action extension for the share sheet. One of Workflow’s most powerful (and flexible) system integrations is the action extension that lets users run a workflow contextual to the app they’re in and the item they’re sharing. From what I hear, Apple plans to keep the same action extension for the Shortcuts app. As someone who regularly uses dozens of workflows activated from the extension in apps like Safari, Ulysses, and Photos, I look forward to adapting them to the Shortcuts extension.
The Gallery stays, but questions remain about public sharing. As confirmed by Apple, the Shortcuts app will have a built-in Gallery for users to browse and download hundreds of pre-made custom shortcuts for different tasks. Shortcuts will be organized in categories and curated by Apple on a regular basis. The Gallery isn’t new to the Workflow app, and it’s unclear if public sharing of user-created custom shortcuts will be part of it. Shortly after Apple acquired Workflow, the company removed the ability for users to share workflows directly to the gallery with public user profiles; it seems like Shortcuts will follow suit, with a Gallery limited to custom shortcuts created and promoted by Apple.
I am confident that the Shortcuts app will continue to offer ways for users to share custom shortcuts with each other, but I believe sharing won’t be tied to the Gallery, at least initially. That said, I would love to see user sharing return to the Gallery in a future update as a curated marketplace of custom shortcuts created by the community and vetted by Apple. I see tremendous potential in letting the iOS community natively extend Siri and iOS apps.
HomeKit integration. The Shortcuts app will support HomeKit; during the keynote, Apple demonstrated how a custom shortcut could set a thermostat to a specific temperature and toggle a fan on and off. We haven’t actually seen how HomeKit actions can be set up in Shortcuts yet though, so it’s not clear if HomeKit actions will let you configure individual characteristics of accessories with granular options. I wouldn’t be surprised if, for simplicity’s sake, the first version of Shortcuts only supports triggering existing HomeKit scenes.
There is a new Show Result action for Siri output. Judging from the slides shown at WWDC, this is the action that will let users craft custom Siri responses in the Shortcuts app. The Show Result action can be filled with Magic Variables and arbitrary plain text; when a custom shortcut is run via Siri, the assistant will speak the text contents of the action.
I can already imagine the possibilities this action opens up – such as the ability to end up with different Siri responses depending on the result of a conditional block in Shortcuts. I’m curious to know what happens to Show Result when a custom shortcut is run outside of the Siri UI though.
No support for passing input to a custom shortcut from Siri. My understanding is that triggering a custom shortcut from Siri won’t allow you to pass along a message as input text. For instance, if you have a custom shortcut that sends the input text to Ulysses and have associated the “Ulysses append” phrase with it, you won’t be able to say “Ulysses append Buy Nintendo Switch” to Siri and expect the “Buy Nintendo Switch” part to be sent to the Ulysses app.
From what I’ve been able to gather so far, Siri in iOS 12 doesn’t currently support the ability to pass an input message to a custom shortcut activated with a user phrase, and it’s unlikely to gain such functionality in the first version of iOS 12. This means that Siri will only be a launcher for custom shortcuts, not an actual interface that can pass user commands for specific parameters at runtime. There’s a lot of complexity involved in this, and I assume it is one of the next big features in the pipeline for Shortcuts.
Native app shortcuts don’t support custom input, output, and customizable fields. On a similar note, native app shortcuts based on SiriKit Intents that execute within the Shortcuts app can’t receive a custom input from previous actions. They also can’t set Magic Variables as custom output and don’t have customizable parameters. As shown in Apple’s screenshots of the Shortcuts app, native app actions are disconnected from every other step in a custom shortcut, which is likely going to limit their flexibility for advanced users.
URL scheme actions for third-party apps should remain available. In addition to manually launching URL schemes with the ‘Open URLs’ and ‘Open x-callback-url’ actions, Workflow has long offered a selection of built-in third-party app actions that are based on URL schemes but abstract that complexity with a visual module. For example, Bear, Things, and Ulysses come with native Workflow actions that can pass along custom parameters when launching the respective apps.
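For context, this is roughly what a URL scheme action abstracts away. The sketch below builds a deep link and hands control to the target app; Things’ publicly documented `things:///add` command is used as the example, but the general pattern is the same for any scheme:

```swift
import UIKit

// Sketch of what a URL scheme action does under the hood: build a
// deep link with percent-encoded parameters and open the target app.
func addToDoInThings(title: String) {
    var components = URLComponents()
    components.scheme = "things"
    components.host = ""
    components.path = "/add"
    components.queryItems = [URLQueryItem(name: "title", value: title)]

    if let url = components.url {
        // Hands control to the other app; unlike SiriKit intents,
        // there is no inline execution and no return value.
        UIApplication.shared.open(url)
    }
}
```

The limitation is visible right in the code: the calling app launches the target app and loses control, which is exactly the round-trip problem x-callback-url was invented to mitigate and that native intent-based actions avoid entirely.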
After talking to several developers at WWDC, it sounds like there’s a good chance third-party app actions powered by URL schemes will remain in the Shortcuts app as well. Personally, I think these will continue to be solid workarounds until SiriKit Intents are powerful and customizable enough to offer the same functionality as URL scheme actions. Ideally, in a future version of Shortcuts, these actions should be replaced by visual SiriKit Intents that can be customized with multiple variables and parameters by users. For now, it seems like traditional URL scheme actions will still allow for deeper customization and control than native app shortcuts.
Shortcuts has a widget with limited user interaction. As shown on Apple’s website, the Shortcuts app will keep the same widget that lets Workflow users run workflows from outside the app.
The widget is one of Workflow’s most peculiar features: it supports remote execution of workflows with basic interactivity, but it kicks you back to the main app if it comes across an action that can’t be completed from the widget, such as entering text with the keyboard or editing an image. The idea of running the same sequence of actions in different, more constrained environments brings me to…
Running custom shortcuts in Siri and audio-only contexts. Because Shortcuts has a widget, and because Apple said custom shortcuts will be offered on iOS, watchOS, HomePod[7], and even CarPlay, I infer that the ability for the same shortcut to run in different contexts is expanding to audio and the watch face’s limited UI. Just like the widget, I assume this means a custom shortcut will completely execute within Siri (whether on HomePod or the Watch) unless an action requires manual user input. In that case, Siri would probably ask you to continue running the shortcut on your iPhone.
If this is the case (and I think my explanation is mostly accurate), I can imagine that custom shortcuts that embed actions such as ‘Choose from List’ or ‘Ask for Input’ will require jumping from Siri to the Shortcuts app. I would be completely okay with this as a first version. Generally speaking though, I’d love for Siri to give me multiple options to choose from a list, allow me to enter input via voice, and interact with a shortcut while it is executing in an audio-only context.
The new Play Media intent. Finally, speaking of audio: SiriKit in iOS 12 supports a new INPlayMedia intent which, as the name suggests, allows the assistant to play audio and video content from third-party apps. The intent can launch apps in the background (such as the radio station demo from the keynote) and supports playing audio on HomePod as well as suggestions on the Lock screen.[8]
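Based on the WWDC session material, handling this intent in an Intents extension could look something like the sketch below. The handler class name is hypothetical; `INPlayMediaIntentHandling` and the `.handleInApp` response code are the actual iOS 12 API that enables background playback without opening the app’s UI:

```swift
import Intents

// Sketch of an Intents-extension handler for the new INPlayMedia intent.
// Responding with .handleInApp asks iOS to launch the app in the
// background and start playback without opening its interface.
final class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // A real app would inspect intent.mediaItems or intent.mediaSearch
        // here and queue up the requested content before responding.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```

The `.handleInApp` response is what made the keynote’s radio demo possible: playback starts immediately in the background instead of bouncing the user out to the app.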
In terms of app adoption and shortcut integrations, this intent should make it possible for the likes of Spotify and Overcast to offer users a way to play their favorite content via Siri just like they can with Apple Music and Podcasts. Overcast and other third-party podcast apps could, for instance, offer Siri shortcut buttons throughout their UIs to let users record phrases such as “Play my Overcast queue” or “Let’s listen to Connected”; playback would then kick off immediately in the background and play through the device’s speakers, HomePod, or other devices. If my interpretation of this is correct, the combination of shortcuts and the new Media intent may alleviate a lot of the annoyances typically involved with using Siri and third-party media services.
The Future of Workflow and Siri
In iOS 12, Apple is providing users with a path from simple shortcuts to custom automation; both revolve around the guiding principle of letting users choose how they want Siri to get things done on their behalf. There is a progressive disclosure of automation happening from system shortcuts to the Shortcuts app: whether you’ve never dabbled with app shortcuts before or are a Workflow expert, iOS 12 wants to help you spend less time using your phone – a recurring theme this year – and let shortcuts do the work for you.
Shortcuts are going to be the feature for developers to adopt this summer. It was the talk of WWDC 2018 and, based on my conversations, developers are excited and optimistic about embedding SiriKit and shortcuts within their apps. In particular, custom phrases and custom SiriKit intents seem to be the most attractive proposition for developers who want to let users conveniently open and interact with their apps.
Shortcuts and custom phrases feel like the future of Siri and apps: they’re supported in every permutation of Siri and, most importantly, they let users develop their own language to access commonly used actions instead of forcing them to remember a syntax made of app names and verbs. Shortcuts, phrases, and custom intents feel like an app- and user-centric response to Alexa skills that Apple is uniquely positioned to build.
While shortcuts are a way to “sell” the idea of lightweight automation to regular users, the Shortcuts app is shaping up to be the automation powerhouse we were hoping to see following the Workflow acquisition. From what I’ve seen so far, the Shortcuts team has been able to build a cohesive narrative around basic shortcuts and custom ones, going far beyond what Workflow could have achieved as an independent app. I’m optimistic that heavy Workflow users won’t be disappointed by Shortcuts.
Even more than feature parity with Workflow though, I see integration of custom shortcuts with Siri as the next frontier for making automation accessible to more people. I believe this is what’s going to push automation forward as something more than “scripting” and other old preconceptions. Giving users the tools to create automations with drag and drop and easily trigger them with their voice is a remarkably powerful idea; it can turn Siri into a truly personal, programmable assistant capable of performing entire series of actions with just one request. I don’t think I’ve ever seen anything even remotely similar to Apple’s demo of the Shortcuts app and Siri integration on other platforms.
Some people, however, may argue that this isn’t real support for multiple commands in Siri; after all, you still have to create a custom shortcut with your favorite actions and manually set it up for Siri. And maybe the Shortcuts app is a way for Apple to circumvent the fact that Siri, unlike Google Assistant, isn’t capable of handling multiple questions in the same sentence yet. Ultimately however, it all goes back to whether you see the beauty and potential of user automation or not. With custom shortcuts, you won’t even have to speak entire sentences containing multiple requests every time you want to execute them; you can just tell Siri a short custom phrase and it’ll fire off multiple steps on its own.
From my perspective, this is exactly what automation is about: making a computer more useful, accessible, and faster so that we can save time to focus on something else. Custom shortcuts and the Shortcuts app show that not only is this vision still very much alive inside Apple, but it’s evolving with the times too.
1. For example, the iPhone has long offered the ability to monitor significant location changes, and Siri already supported proactive app recommendations.
2. As was previously the case, snippets of custom UIs built with SiriKit aren’t interactive: an app can bring up a restaurant menu as an animated or static image, but users can’t tap inside the Siri snippet to interact with it.
3. Unsurprisingly absent so far: macOS.
4. Yes, we are literally talking about shortcuts within custom shortcuts in the Shortcuts app, but bear with me here.
5. I believe this will be managed from the settings icon currently located in the top right of the shortcut editor.
6. If the Shortcuts app keeps all the features from Workflow, its existing support for web APIs means Siri in iOS 12 could be extended with basic integrations for any web service with an API. That would be pretty wild.
7. On HomePod, shortcuts will be available via Personal Requests.
8. There is even a new API for developers of media apps to recommend new episodes of periodic content (like TV shows and podcasts) to users alongside suggested shortcuts.