Posts tagged with "siri"

Apple and the Alexa Ecosystem

I recently read two interesting takes on the ever-growing Alexa ecosystem as it relates to Apple that made me think about the future of Siri and HomeKit. Here’s M.G. Siegler on Amazon’s plan to put Alexa everywhere:

The Echo Dot was the number one selling device across all of Amazon during the holiday shopping season. (The Fire TV stick with the Alexa-enabled remote was the second-most popular product.) Again, no absolute sales numbers beyond “tens of millions of Alexa-enabled devices” — more than we usually get, by the way — but no matter: tens of millions is impressive enough.

I’ve been thinking about this recently not just in the context of putting Echoes in hotels, but also relative to Apple. As we’re all well aware, Apple had to delay their foray into the space, the HomePod, into 2018. But not only did they miss the all-important holiday shopping season, I’m increasingly thinking that they may have missed the boat.

Believe me, I know how dangerous this line of thinking is with regard to Apple. Apple is almost never the first-mover in a market. Instead, they prefer to sit back and let markets mature enough to then swoop in with their effort, which more often than not is the best effort (this is both subjective in terms of my own taste, and often objective in terms of sales). But again, I increasingly don’t believe that this will be the case with their smart speaker.

Amazon has entered the speaker and home automation market with Alexa-enabled devices in two ways: first with their own Echo products, then with a growing roster of third-party manufacturers that are baking Alexa into their devices and almost treating Amazon’s assistant as a “standard” feature like WiFi or Bluetooth. There’s a fascinating parallel between Amazon Web Services – a suite of components embedded in the majority of modern websites and web apps – and Alexa Voice Service – a suite of voice APIs now embedded in hundreds of automation devices, general-purpose accessories and appliances, and web services.

Here’s Ben Bajarin on what Alexa’s presence at CES tells us about the ecosystem surrounding Apple:

While many Apple defenders want to dismiss the momentum we are observing with the Amazon ecosystem on display here at CES (Amazon itself, like Apple, is not present), I believe it is a mistake to do so.

It is easy to say that because Apple was never present at CES that the show didn’t mean something to them or their ecosystem. It is easy, and correct to say that CES was not, or never was, a measure of the health of Apple’s products. It is, however, incorrect and dangerous to miss that CES had been, for some time, a barometer for the health of Apple’s ecosystem.

As I mentioned, our ability to measure any platform’s ecosystem from what we observe at CES is the main reason so many are paying attention to what is happening with Amazon’s Alexa platform. Google Assistant is certainly more present than it was last year; however, when you look at how third parties are talking about, and marketing, their support of these assistants, they are putting significantly more effort into talking about Alexa than Google Assistant. Which is a telling signal. Again, to reiterate this point, third parties used to market, and spend energy talking about, their integration with iOS or support of iPhone/iPad with the same rigor they are now talking about Amazon’s Alexa. This cannot be ignored.

You could argue that most Apple-compatible gadgets and accessories announced at CES used to appear in tech blogs only to be forgotten a few months later because they were fads, vaporware, or ultimately not essential to the growth of the iOS ecosystem, and that the same will happen with the Alexa-enabled devices we’ve seen this year. The difference, I think, is that this new generation of home automation products is an ecosystem in itself, with higher value than, say, the iPad keyboards or styli we used to see at CES. Alexa hasn’t “won”, but it has momentum among third-party companies making products that are or will soon be in our homes, sharing the same space as our TVs, routers, consoles, and mobile devices.



SiriKit Support Will Be Included on HomePod for Certain Third-Party App Domains

In an update on its developer site today, Apple announced that SiriKit support will be included in the forthcoming HomePod device.

iOS 11.2 introduces SiriKit for HomePod, the powerful speaker that sounds amazing, adapts to wherever it’s playing, and provides instant access to Apple Music. HomePod is also a helpful home assistant for everyday questions and tasks. With the intelligence of Siri, users control HomePod through natural voice interaction. And with SiriKit, users can access iOS apps for Messaging, Lists, and Notes. Make sure your SiriKit integration is up to date and test your app’s voice-only experience today.

When the HomePod was first unveiled at WWDC in June, there was no word regarding whether it would support third-party apps through SiriKit. While some expected further details on HomePod to arrive in September’s keynote, that event came and went with no news. Today marks the first time we’ve learned anything new about HomePod from official sources since its initial introduction.

Messaging, Lists, and Notes are the only SiriKit domains that will work with HomePod at launch, but it’s likely we’ll see more added over time. This means the HomePod will be able to do things like send messages in Telegram, create tasks in Todoist or Things, and create notes in Evernote.

Information about how SiriKit on HomePod will work is included on Apple’s full SiriKit site, which contains the following detail:

Siri recognizes SiriKit requests made on HomePod and sends those requests to the user’s iOS device for processing.

This means SiriKit support on HomePod will take a fundamentally different path than that of competing products like the Amazon Echo. In most cases, the Echo’s third-party Alexa skills operate completely independently of any other device. For the time being at least, third-party support on HomePod will rely on a nearby iPhone or iPad.

There are still many HomePod details that remain unclear. For example, we don’t yet know if SiriKit requests or iCloud requests will be tied to a single user’s device, or if a family will be able to interact with the device in a user-specific way. Time will bring the full story into view, but for now, knowing that HomePod will launch with at least some third-party app support is good news.


Apple Defends Siri, Explaining How Its Voice Assistant Works

On the heels of a feature story in Wired last week, Apple executives and engineers opened up about how Siri works in interviews with Fast Company. As the publication explained it, a narrative has emerged that Apple’s AI work is behind other companies’ efforts because of its dedication to user privacy.

In an interview with Fast Company, Apple’s Greg Joswiak disagrees:

“I think it is a false narrative. It’s true that we like to keep the data as optimized as possible, that’s certainly something that I think a lot of users have come to expect, and they know that we’re treating their privacy maybe different than some others are.”

Joswiak argues that Siri can be every bit as helpful as other assistants without accumulating a lot of personal user data in the cloud, as companies like Facebook and Google are accustomed to doing. “We’re able to deliver a very personalized experience … without treating you as a product that keeps your information and sells it to the highest bidder. That’s just not the way we operate.”

The article provides concrete examples of how Siri works and of the advances made since its introduction, in a level of detail that Apple has not shared before.

The effectiveness of Siri and Apple’s machine learning research is an area where Apple’s culture of secrecy has hurt it. Apple seems to have recognized this and has made a concerted effort to turn perceptions around with interviews like the ones in Wired and Fast Company. Apple employees have also begun to engage in more public discussion of the company’s machine learning and AI initiatives through outlets like its recently introduced journal and conference presentations. Apple even enlisted The Rock to help it get the word out about Siri’s capabilities. Competition for virtual personal assistant supremacy has heated up, and Apple has signaled it has no intention of being left out or backing down.


Making Siri More Human

David Pierce has a feature story on WIRED today that’s all about Siri – especially the new Siri voice coming in iOS 11. It features a variety of interesting details concerning Siri’s history, the way Apple thinks about the digital assistant, and in-depth details on how new Siri languages are added.

One of my favorite bits involves a quote from Apple’s VP of product marketing, Greg Joswiak, who said Apple focuses on Siri’s ability to get things done:

It drives him crazy that people compare virtual assistants by asking trivia questions, which always makes Siri look bad. “We didn’t engineer this thing to be Trivial Pursuit!” he says.

This explains Siri’s productivity-focused commercial starring The Rock, and also helps make sense of the fact that Siri is often embarrassingly clueless when it comes to current events or other simple queries. Still, Apple’s apparent awareness of the problem makes its failure to beef up Siri’s trivia knowledge all the more puzzling.

Other interesting tidbits from the story include the fact that Siri now has a massive 375 million monthly active users, and that Siri’s new, more natural voice was inspired in part by the movie Her.


Siri’s Co-Creator Shares His Vision for the Future of AI

Earlier this year Tom Gruber, the co-creator of Siri and current member of Apple’s AI team, gave a TED talk focusing on his vision for the future of AI, which is rooted in a philosophy he calls “humanistic AI.” The video and full transcript for that talk recently became available, providing a broader audience with Gruber’s insights into the place of AI in our everyday lives. While he doesn’t offer any specifics regarding work Apple is doing in this space, it is clear that Gruber’s vision represents, at least in part, the vision of Apple for Siri and AI as a whole.

Gruber describes humanistic AI as “artificial intelligence designed to meet human needs by collaborating and augmenting people.” This theme of AI augmenting and complementing humans is fleshed out by Gruber in several ways; one example involves Siri serving as an accessibility tool, while another theorizes about the benefits AI can offer to human memory. The full talk provides an interesting glimpse into how Apple sees AI evolving in the near future.


Siri Featured in Apple Ad Starring The Rock

Apple released an advertisement showcasing Siri, starring former pro wrestler turned film star Dwayne “The Rock” Johnson. Teased yesterday by Johnson on Twitter and Facebook, the video, posted to Apple’s YouTube channel, features Johnson accomplishing a long list of life goals with the help of Siri during a single day. The tongue-in-cheek spot highlights several Siri features such as:

  • reading Johnson’s schedule;
  • creating a reminder;
  • scheduling a Lyft ride;
  • getting the weather forecast;
  • reading email;
  • displaying photos;
  • texting someone;
  • converting measurements;
  • playing a playlist;
  • starting a FaceTime call; and
  • taking a selfie.

The Siri ad is a clever and entertaining way of explaining the breadth of tasks that can be accomplished with Siri, from the basics like weather forecasts to less well-known features like taking a selfie.


iOS 11 and Accessibility

Great overview by Steven Aquino on the Accessibility changes coming with iOS 11. In particular, he’s got the details on Type to Siri, a new option for keyboard interaction with the assistant:

Available on iOS and the Mac, Type to Siri is a feature whereby a user can interact with Siri via an iMessage-like UI. Apple says the interaction is one-way; presently it’s not possible to simultaneously switch between text and voice. There are two caveats, however. The first is, it’s possible to use the system-wide Siri Dictation feature (the mic button on the keyboard) in conjunction with typing. Therefore, instead of typing everything, you can dictate text and send commands thusly. The other caveat pertains to “Hey Siri.” According to a macOS Siri engineer on Twitter, who responded to this tweet I wrote about the feature, it seems Type to Siri is initiated only by a press of the Home button. The verbal “Hey Siri” trigger will cause Siri to await voice input as normal.

Technicalities aside, Type to Siri is a feature many have clamored for, and should prove useful across a variety of situations. In an accessibility context, this feature should be a boon for deaf and hard-of-hearing people, who previously may have felt excluded from using Siri due to its voice-first nature. It levels the playing field by democratizing the technology, opening up Siri to an even wider group of people.

I wish there was a way to switch between voice and keyboard input from the same UI, but retaining the ‘Hey Siri’ voice activation seems like a sensible trade-off. I’m probably going to enable Type to Siri on my iPad, where I’m typing most of the time anyway, and where I could save time with “Siri templates” made with native iOS Text Replacements.


Dispelling the Apple Services Myth

Apple is known for its quality hardware and software, but services are another story.

Cloud-based services are the future – there’s no denying that. And Apple historically has struggled with its cloud offerings. From MobileMe, to the early growing pains of iCloud, to the Apple Maps fiasco, the company gained a poor reputation in the area of services.

Only in the last two years has Apple publicly touted services as a core part of its business. Company press releases as recently as May 2015 ended with the following self-definition:

Apple designs Macs, the best personal computers in the world, along with OS X, iLife, iWork and professional software. Apple leads the digital music revolution with its iPods and iTunes online store. Apple has reinvented the mobile phone with its revolutionary iPhone and App Store, and is defining the future of mobile media and computing devices with iPad.

There’s a lot that feels outdated here, including the fact that both Mac and iPod are highlighted before the iPhone. But one major way this paragraph fails to describe the Apple of today is that the word ‘services’ is nowhere to be found.

Amid a variety of other changes, Apple’s current self-definition includes the following:

Apple’s four software platforms — iOS, macOS, watchOS and tvOS — provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay and iCloud.

Services are a key component of modern Apple. The way the company defines itself, along with the numerous services shoutouts in quarterly earnings calls, proves that.

Despite Apple’s increased focus on services, the common narrative that the company “can’t do services” still hangs around – in online tech circles at least.

But is that narrative still true, or has it grown outdated?

I want to share how I use Apple services in my everyday life across three important contexts:

  • As I work,
  • On the go, and
  • Around the house.

My aim is not to perform an in-depth comparison of Apple’s cloud offerings and competing products. Though competitors and their features will come up occasionally, the focus here is on my experiences in everyday living – my experiences, not yours. I understand that just because something does or doesn’t work for me, the same isn’t necessarily true for you. The point of this piece is not to try to prove anything; instead, I simply want to assess and share my current experiences with Apple’s services.



The Way Siri Learns New Languages

Stephen Nellis, writing for Reuters, shares an interesting look into Apple’s method for teaching Siri a new language:

At Apple, the company starts working on a new language by bringing in humans to read passages in a range of accents and dialects, which are then transcribed by hand so the computer has an exact representation of the spoken text to learn from, said Alex Acero, head of the speech team at Apple. Apple also captures a range of sounds in a variety of voices. From there, an acoustic model is built that tries to predict word sequences.

Then Apple deploys “dictation mode,” its speech-to-text translator, in the new language, Acero said. When customers use dictation mode, Apple captures a small percentage of the audio recordings and makes them anonymous. The recordings, complete with background noise and mumbled words, are transcribed by humans, a process that helps cut the speech recognition error rate in half.

After enough data has been gathered and a voice actor has been recorded to play Siri in a new language, Siri is released with answers to what Apple estimates will be the most common questions, Acero said. Once released, Siri learns more about what real-world users ask and is updated every two weeks with more tweaks.
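The pipeline described in the excerpt ends with a model that tries to predict word sequences. As a generic illustration of that idea only (a toy bigram language model, not Apple’s actual system, which is far larger and combines acoustic and language models), here is a minimal sketch in Python:

```python
from collections import defaultdict

def train_bigram_model(transcripts):
    """Count word-pair frequencies from hand-transcribed text."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in transcripts:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return max(followers, key=followers.get)

# A few hypothetical transcribed utterances
model = train_bigram_model([
    "play some music",
    "play some jazz",
    "play the weather report",
])
print(predict_next(model, "play"))  # prints "some"
```

A production recognizer would rank candidate transcriptions with a vastly larger statistical (or neural) language model on top of the acoustic model, but the core idea of scoring likely next words is the same.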

The report also shares that one of Siri’s next languages will be Shanghainese, a dialect of Wu Chinese spoken in Shanghai and surrounding areas. This addition will join the 21 languages Siri currently speaks, which are localized across a total of 36 countries.

Debating the strengths and weaknesses of Siri has become common practice in recent years, particularly as competing voice assistants from Amazon, Google, and Microsoft have grown more intelligent. But one area in which Siri has long held the lead over its competition is support for a wide variety of languages. It doesn’t seem like Apple will be slowing down in that regard.
