What should a wrist computer ideally do for you?
Telling the time is a given, and activity tracking has become another default inclusion for that category of gadget. But we're talking about a computer here, not a simple watch with a built-in pedometer. The device should present the information you need, exactly when you need it. This includes notifications, to be sure, but also basic data like the weather forecast and current date. It should integrate with the various cloud services you depend on to keep your life and work running – calendars, task managers, and the like. It doesn't have to be all business though – throwing in a little surprise and delight would be nice too, because we can all use some added sparks of joy throughout our days.
Each of these different data sources streaming through such a device presents a dilemma: how do you fit so much data on such a tiny screen? By necessity a wrist computer's display is small, limiting how much information it can offer at once. This challenge makes it extremely important for the device to offer data that's contextual – fit for the occasion – and dynamic – constantly changing.
Serving a constant flow of relevant data is great, but a computer that's tied to your wrist, always close at hand, could do even more. It could serve as a control center of sorts, providing a quick and easy way to perform common actions – setting a timer or alarm, toggling smart home devices on and off, adjusting audio playback, and so on. Each of these controls must be presented at just the right time, custom-tailored for your normal daily needs.
If all of this sounds familiar, it's because this product already exists: the Apple Watch. However, most of the functionality I described doesn't apply to the average Watch owner's experience, because most people use a watch face that doesn't offer these capabilities – at least not many of them. The Watch experience closest to that of the ideal wrist computer I've envisioned is only possible with a single watch face: the Siri face.
Before I go any further, let me tell you that some of what I’m going to say here was already covered by David Sparks in this post from almost six years ago. That was just a year and a half after the “beta” introduction of Siri with the iPhone 4S, and David was pleased with what Siri could do. I like a lot of what Siri can do with dates, too, but there are still some frustrating blind spots and inconsistencies. In fact, with one of David’s examples, Siri isn’t as convenient as it was six years ago.
Context has always been one of Siri’s weaknesses, and that’s where it failed Casey. Any normal human being would understand immediately that a question asked in January about days since a day in December is talking about the December of the previous year. But Siri ignores (or doesn’t understand) the word “since” and calculates the days until the next December 18.
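The inference a human makes here is easy to state as a rule: if the named month and day haven't occurred yet this year, a "since" question must refer to last year's date. A minimal, purely illustrative sketch in Python (the function name and dates are my own, not anything Siri actually runs):

```python
from datetime import date

def days_since(month: int, day: int, today: date) -> int:
    """Days elapsed since the most recent occurrence of month/day.

    If that month/day falls after today's date in the current year,
    a "since" question must mean the previous year's occurrence.
    """
    candidate = date(today.year, month, day)
    if candidate > today:
        # The date hasn't happened yet this year: look backward.
        candidate = date(today.year - 1, month, day)
    return (today - candidate).days

# Asked in January 2019 about "days since December 18":
print(days_since(12, 18, date(2019, 1, 10)))  # → 23
```

Two lines of year inference are all it takes to resolve the ambiguity that tripped Siri up; the word "since" tells you which direction to look.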
A solid collection of Siri date-calculation examples by Dr. Drang. As he notes, it's not that Siri can't answer complex questions involving dates – it's that you often have to phrase your questions with an exact syntax that a computer program can understand. This is frustrating because Apple promotes Siri as a smart assistant that can infer context without a refined syntax. I still run into a similar problem with time zone conversions; of course, the old trick I used to rely on no longer works for me unless I preface the question with "Ask Wolfram".
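The conversion itself is trivial for software once the question is phrased unambiguously – the hard part is the natural-language parsing. For reference, a sketch using Python's `zoneinfo` (the zone names and times are arbitrary examples of mine):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def convert(dt_str: str, from_tz: str, to_tz: str,
            fmt: str = "%Y-%m-%d %H:%M") -> str:
    """Convert a wall-clock time from one IANA time zone to another."""
    dt = datetime.strptime(dt_str, fmt).replace(tzinfo=ZoneInfo(from_tz))
    return dt.astimezone(ZoneInfo(to_tz)).strftime(fmt)

# New York is UTC-5 in December, Rome is UTC+1: a six-hour jump.
print(convert("2018-12-18 09:00", "America/New_York", "Europe/Rome"))
# → 2018-12-18 15:00
```

Everything Siri needs is in standard libraries; the blind spots are in understanding what was asked, not in doing the math.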
Today Apple announced that one of its most recent high-profile hires, John Giannandrea, has been added as the twelfth member of the company's executive team. His title is now Senior Vice President of Machine Learning and AI Strategy. From the press release:
“John hit the ground running at Apple and we are thrilled to have him as part of our executive team,” said Tim Cook, Apple’s CEO. “Machine learning and AI are important to Apple’s future as they are fundamentally changing the way people interact with technology, and already helping our customers live better lives. We’re fortunate to have John, a leader in the AI industry, driving our efforts in this critical area.”
News of Giannandrea's hiring at Apple first broke in April in The New York Times. Apple didn't formally announce the hire, however, until July. And here we are just a few short months later, with another press release from Apple announcing his promotion.
Giannandrea's role involves leadership of Siri, machine learning, and other artificial intelligence projects, all of which are squarely in his wheelhouse given his former role as Google's chief of search and artificial intelligence. While it's hard to say from the outside what kind of difference his influence is making at Apple, this move is a good sign that the company's pleased with his early months of work. Perhaps we'll get to see the fruits of his labors at WWDC 2019.
Apple’s online Machine Learning Journal has published a paper on the methodologies the HomePod uses to implement Siri functionality in far-field settings. As Apple’s Audio Software Engineering and Siri Speech Teams explain:
Siri on HomePod is designed to work in challenging usage scenarios such as:
- During loud music playback
- When the talker is far away from HomePod
- When other sound sources in a room, such as a TV or household appliances, are active
Each of those conditions requires a different approach to effectively separate a spoken Siri command from other household sounds and to do so efficiently. The report notes that the HomePod’s speech enhancement system uses less than 15% of one core of a 1.4 GHz A8 processor.
Apple engineers tested their speech enhancement system under a variety of conditions:
We evaluated the performance of the proposed speech processing system on a large speech test set recorded on HomePod in several acoustic conditions:
- Music and podcast playback at different levels
- Continuous background noise, including babble and rain noise
- Directional noises generated by household appliances such as a vacuum cleaner, hairdryer, and microwave
- Interference from external competing sources of speech
In these recordings, we varied the locations of HomePod and the test subjects to cover different use cases, for example, in living room or kitchen environments where HomePod was placed against the wall or in the middle of the room.
The paper concludes with examples of filtered and unfiltered audio from those HomePod tests. Regardless of whether you’re interested in the details of noise reduction technology, the sample audio clips are worth a listen. It’s impressive to hear barely audible commands emerge from background noises like a dishwasher and music playback.
If you're at all like me, then now is the time of year you receive the most packages. In the midst of the holiday season, the usual assortment of deliveries my wife and I have coming for ourselves is joined by all the gifts we've purchased for others. It couldn't be better timing, then, that my package tracker of choice, Deliveries, has added Siri shortcut support today.
Google released an exciting update for its Assistant iOS app today, bringing support for Siri shortcuts and, for the first time, opening lines of communication between the two competing assistants.
Siri and the Google Assistant have historically been unable to work together in any way, but thanks to the opening up of Siri via shortcuts in iOS 12, that changes now. With the latest update, you can set up a shortcut in iOS to immediately, via Siri, trigger any command you'd like to give Google's Assistant.
Evgeny Cherpak’s iOS app, Remote Control for Mac, has been updated with Siri shortcut support, which opens up some interesting ways to control a Mac with shortcuts. I’ve been using the app’s new Siri shortcuts for about a week and, as I covered on AppStories today, the shortcuts I’ve created that incorporate Remote’s functionality are already ones that I use every day.
I previously covered HomeCam, a HomeKit utility by indie developer Aaron Pearce, as a superior way to watch live video streams from multiple HomeKit cameras. In addition to a clean design and straightforward approach (your cameras are displayed in a grid), what set HomeCam apart was the ability to view information from other HomeKit accessories located in the same room as a camera and control nearby lights without leaving the camera UI. Compared to Apple's clunky approach of opening cameras in the Home app, HomeCam is a nimble, must-have utility for anyone who owns multiple HomeKit cameras and wants to tune into their video feeds quickly. With the release of iOS 12, HomeCam is gaining one of the most impressive and useful implementations of Siri shortcuts I've seen on the platform yet.
Among the actions that didn't make the transition from Workflow to the new Shortcuts app for iOS 12, built-in support for triggering IFTTT applets (formerly known as "recipes") is perhaps the most annoying one. With just a few taps, Workflow's old 'Trigger IFTTT Applet' action allowed you to assemble workflows that combined the power of iOS integrations with IFTTT's hundreds of supported services. The IFTTT action acted as a bridge between Workflow and services that didn't offer native support for the app, such as Google Sheets, Spotify, and several smart home devices.
Fortunately, there's still a way to integrate the just-released Shortcuts app with IFTTT. The method I'm going to describe below involves a bit more manual setup, as it's not as nicely integrated with Shortcuts as the old action was. In return, however, you'll unlock the ability to trigger IFTTT applets using Siri on your iOS devices, Apple Watch, and HomePod – something that was never possible with Workflow's original IFTTT support. Let's take a look.
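Under the hood, a bridge like this typically rests on IFTTT's Webhooks service: the applet listens for a named event, and Shortcuts fires it with a plain HTTP request via the 'Get Contents of URL' action. A sketch of the request being assembled (the event name and key below are placeholders for your own Webhooks settings; `value1` through `value3` are IFTTT's documented ingredient fields):

```python
import json
from urllib.parse import quote

# IFTTT's Webhooks trigger endpoint: event name selects the applet,
# the key (from your Webhooks settings page) authenticates you.
IFTTT_TRIGGER = "https://maker.ifttt.com/trigger/{event}/with/key/{key}"

def build_ifttt_request(event, key, values=None):
    """Build the URL and JSON body for an IFTTT Webhooks trigger.

    Shortcuts' 'Get Contents of URL' action performs the equivalent
    POST; up to three ingredient values ride along in the JSON body.
    """
    url = IFTTT_TRIGGER.format(event=quote(event), key=quote(key))
    body = json.dumps(values or {})
    return url, body

# 'water_plants' and 'abc123' are hypothetical examples.
url, body = build_ifttt_request("water_plants", "abc123",
                                {"value1": "balcony"})
print(url)  # → https://maker.ifttt.com/trigger/water_plants/with/key/abc123
```

Because the trigger is just a URL request, anything that can run a shortcut – Siri on the Watch or HomePod included – can now kick off the applet, which is exactly the capability the old Workflow action never had.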