Apple Inc. has purchased Emotient Inc., a startup that uses artificial-intelligence technology to read people’s emotions by analyzing facial expressions.
It isn’t clear what Apple plans to do with Emotient’s technology, which was primarily sold to advertisers to help assess viewer reactions to their ads. Doctors also have tested it to interpret signs of pain among patients unable to express themselves, and a retailer used it to monitor shoppers’ facial expressions in store aisles, the company had said.
As I argued in a section of my iOS 9 review last year, my experience with Apple's services is that, when it comes to intelligence, they've consistently been less proactive and slower than Google's. Two examples: I can search for photos by subject in Google Photos, and the Google mobile app sends me time-to-leave alerts that actually make sense.
The differences in intelligence between Google and Apple come with separate sets of trade-offs. The question for Apple should be: are there more ways to leverage AI to provide useful services while still prioritizing user privacy? How can Siri and iOS's intelligence features expand without compromising Apple's vision? Is that even possible in the long term without relying on cloud-based deep learning trained on user data? Is Apple considering new approaches that sit somewhere in the middle?
It's from this standpoint that I consider Apple's AI acquisitions (Perceptio, VocalIQ, and now Emotient), and it'll be interesting to see what iOS 10 holds in this area.