Today Cochlear introduced a new cochlear implant sound processor, the first such device to connect directly to iOS devices. The company’s press release states:
With the Nucleus 7 Sound Processor, people with a Cochlear Nucleus Implant can now stream sound from a compatible iPhone, iPad and iPod touch directly to their sound processor. They will also be able to control, monitor and customize their hearing on their iPhone or iPod touch through the Nucleus® Smart App available to download for free from the App Store®.
The Nucleus Smart app also includes a feature resembling Apple’s ‘Find My iPhone’ called ‘Find My Processor.’ Especially helpful for children who may be more prone to losing their sound processor, this feature employs an iPhone’s built-in location services to determine the last place the processor was connected to its paired iPhone.
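As a rough illustration of the pattern described above, here is a minimal Swift sketch: record a timestamped location whenever the paired device reports a disconnect. The names here (`ProcessorLocator`, `Coordinate`) are hypothetical, not Cochlear’s or Apple’s API, and a real app would be driven by CoreBluetooth disconnect callbacks and CoreLocation rather than caller-supplied values.

```swift
import Foundation

// Illustrative coordinate pair; a real app would use CLLocationCoordinate2D.
struct Coordinate {
    let latitude: Double
    let longitude: Double
}

// Hypothetical sketch of a "Find My Processor"-style feature: whenever the
// phone loses its connection to the sound processor, snapshot the phone's
// current location so the last-seen spot can be shown on a map later.
final class ProcessorLocator {
    private(set) var lastSeen: (location: Coordinate, at: Date)?

    // In a real app this would be called from a CoreBluetooth
    // didDisconnectPeripheral callback, with the location supplied
    // by a CLLocationManager.
    func processorDidDisconnect(at location: Coordinate, when date: Date = Date()) {
        lastSeen = (location, date)
    }
}
```

The design choice worth noting is that the phone, not the processor, does the locating: the implant hardware needs no GPS of its own, only a Bluetooth link whose loss the phone can observe.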
Sarah Buhr of TechCrunch notes that today’s announcement is the fruit of a lengthy period of research and development within Apple in response to the growing issue of hearing loss.
Apple…has spent a number of years developing a hearing aid program within the company. Apple soon developed a protocol the company offered for free for hearing aid and implant manufacturers to use with their devices.
Today Microsoft introduced a new app exclusively for iPhone, Seeing AI. This app is designed as a tool for the low vision community; using the iPhone’s camera and its AI smarts, Seeing AI converts the visual experience of the world into an audible one. As you point the camera at things in the world around you, the app will describe that world in a quick, informative manner.
From a user’s perspective, the app is tremendously simple to use; there’s very little that needs to be done before Seeing AI can begin describing the space around you. If you want to identify people, you can first set them up as recognizable from the sidebar menu’s ‘Face Recognition’ option. Otherwise, all you have to do to start identifying things is select from one of five different categories (the app calls them ‘channels’) to help the app understand what type of object it needs to identify. The five current categories are:
- Short Text
- Document
- Product
- Person
- Scene (currently tagged as ‘Beta’)
Microsoft says a category for currency will be coming soon, allowing the app to intelligently identify different denominations of cash.
In my testing of the app, it’s far from perfect in its ability to identify things, but it has done a solid job all-around. Though the tech driving the app may only be experimental and have a long way to go, the app is far from barebones in what it can do now. When identifying a document, Seeing AI will audibly guide you through the capture process to help you get the full document in view. After scanning a product’s barcode, in some cases you’ll receive additional information about the product beyond just its name. And if the app is scanning a person, it can even describe a best guess at their visible emotional state. It’s an impressive, deep experience that nevertheless remains dead simple to operate.
Even if you aren’t in the market for Seeing AI yourself, it’s a fascinating product worth checking out, and it’s entirely free. You can download it on the App Store.
Microsoft has a short introductory video that gives a great taste of all that the app can do, embedded below.
Great overview by Steven Aquino on the Accessibility changes coming with iOS 11. In particular, he’s got the details on Type to Siri, a new option for keyboard interaction with the assistant:
Available on iOS and the Mac, Type to Siri is a feature whereby a user can interact with Siri via an iMessage-like UI. Apple says the interaction is one-way; presently it’s not possible to simultaneously switch between text and voice. There are two caveats, however. The first is, it’s possible to use the system-wide Siri Dictation feature (the mic button on the keyboard) in conjunction with typing. Therefore, instead of typing everything, you can dictate text and send commands thusly. The other caveat pertains to “Hey Siri.” According to a macOS Siri engineer on Twitter, who responded to this tweet I wrote about the feature, it seems Type to Siri is initiated only by a press of the Home button. The verbal “Hey Siri” trigger will cause Siri to await voice input as normal.
Technicalities aside, Type to Siri is a feature many have clamored for, and should prove useful across a variety of situations. In an accessibility context, this feature should be a boon for deaf and hard-of-hearing people, who previously may have felt excluded from using Siri due to its voice-first nature. It levels the playing field by democratizing the technology, opening up Siri to an even wider group of people.
I wish there was a way to switch between voice and keyboard input from the same UI, but retaining the ‘Hey Siri’ voice activation seems like a sensible trade-off. I’m probably going to enable Type to Siri on my iPad, where I’m typing most of the time anyway, and where I could save time with “Siri templates” made with native iOS Text Replacements.
Early last year, James Rath, a young filmmaker who was born legally blind, created a video about the impact Apple products have had on his life. That video caught the attention of Apple:
In the ensuing months, Rath’s YouTube career has taken off and he’s become a strong advocate for the blind.
To mark Global Accessibility Awareness Day, Tim Cook spoke about accessibility with Rath and two other YouTubers, Rikki Poynter and Tatiana Lee. Cook and Poynter, who is deaf, discussed closed captioning and how accessibility is a core value at Apple. Lee talked to Cook about the Apple Watch and its ability to track wheelchair use. Rath and Cook explored the history of Apple’s commitment to accessibility and the democratization of technology. The interviews follow the release of a series of videos made by Apple spotlighting the accessibility features of its products.
The interviews, which were filmed in the courtyard at Apple’s Infinite Loop campus, are available after the break.
Katie Dupere writes for Mashable about the stories shared in a new series of Apple videos:
Meera is nonverbal, living with a rare condition called schizencephaly that impacts her ability to speak. But with the help of her iPad and text-to-speech technology, she can make her thoughts and opinions known — and she sure does. From her love of Katy Perry to her passion for soccer, Meera will let you know exactly what’s on her mind. All it takes is a few taps of her tablet, and with a specialized app stringing letters into words, and words into phrases, her thoughts are played out loud.
Meera’s relationship with tech is just one of seven stories featured in a powerful video series created by Apple to spotlight the company’s dedication to accessible technology. The videos were released in celebration of Global Accessibility Awareness Day on May 18, a day emphasizing the importance of accessible tech and design.
Accessibility features have long been prioritized in Apple’s software, and this new video series tells the stories of people who depend on those features. What to some may simply be an ignored option in the Settings app is to others a pathway to significant new experiences and empowerment.
Some interesting thoughts about the AirPods by Steven Aquino. In particular, he highlights a weak aspect of Siri that isn’t usually mentioned in traditional reviews:
The gist of my concern is Siri doesn’t handle speech impediments very gracefully. (I’ve found the same is true of Amazon’s Alexa, as I recently bought an Echo Dot to try out.) I’m a stutterer, which causes a lot of repetitive sounds and long breaks between words. This seems to confuse the hell out of these voice-driven interfaces. The crux of the problem lies in the fact that if I don’t enunciate perfectly, which leaves several seconds between words, the AI cuts me off and runs with it. Oftentimes, the feedback is weird or I’ll get a “Sorry, I didn’t get that” reply. It’s an exercise in futility, sadly.
Siri on the AirPods suffers from the same issues I encounter on my other devices. It’s too frustrating to try to fumble my way through if she keeps asking me to repeat myself. It’s for this reason that I don’t use Siri at all with AirPods, having changed the setting to enable Play/Pause on double-tap instead (more on this later). It sucks to not use Siri this way—again, the future implications are glaringly obvious—but it’s just not strong enough at reliably parsing my speech. Therefore, AirPods lose some luster because one of its main selling points is effectively inaccessible for a person like me.
That’s a hard problem to solve in a conversational assistant, and exactly the kind of Accessibility area where Apple could lead over other companies.
Apple opened what will in all likelihood be its last event in Town Hall at One Infinite Loop in Cupertino with a video highlighting the importance of the accessibility features built into its products. In addition to the video, Apple has created a separate webpage dedicated to accessibility. The page includes videos covering wheelchair workouts on the Apple Watch, Switch Control on the Mac, Live Listen for the hearing impaired, VoiceOver, and Speak Screen.
You can also follow all of the MacStories coverage of today’s Apple keynote through our October 27 Keynote hub, or subscribe to the dedicated October 27 Keynote RSS feed.
The impact of the accessibility features built into iOS cannot be overstated. Accessibility has opened doors to computing that were previously shut to many people with disabilities. With iOS 10 launching soon, Steven Aquino takes a look at iOS accessibility milestones in a guest post on 512 Pixels that focuses on five key features: VoiceOver, Guided Access, Large Dynamic Type, Switch Control, and Magnifier.
It feels like VoiceOver has been around forever, so it was interesting to be reminded that it didn’t debut until the introduction of the iPhone 3GS and has its roots in the short-lived buttonless iPod Shuffle. Of the other accessibility features covered, the one that will probably be the least familiar to most readers is Magnifier, which is coming in iOS 10. Aquino believes that when we look back at iOS 10 in the future, Magnifier will be viewed as one of the greatest enhancements to iOS:
The reason I’m so effusive about Magnifier is the handiness of it. So often, I’m reading a restaurant menu or looking at price tags in the grocery store, and the print is set in a small font. Where previously I would strain my eyes in order to see, now all I need to do is pull out my phone and triple-press the Home button to launch Magnifier.
Steven has written a lot about the accessibility of Apple products, including here at MacStories, and does a wonderful job putting each accessibility milestone into historical context. I encourage everyone to read the full article on 512 Pixels.
Mashable published a profile of Jordyn Castor, a 22-year-old Apple engineer on its accessibility design and quality team. Castor, who has been blind since birth, joined Apple full-time last year after interning there in college. As she explains to Mashable:
“I realized then I could code on the computer to have it fulfill the tasks I wanted it to,” says Castor, whose current work focuses on enhancing features like VoiceOver for blind Apple users. “I came to realize that with my knowledge of computers and technology, I could help change the world for people with disabilities.”
In addition to speaking with Castor, Mashable interviewed Sarah Herrlinger, senior manager for global accessibility policy and initiatives, who explained Apple’s approach to accessibility:
a notable part of the company’s steps toward accessibility is its dedication to making inclusivity features standard, not specialized. This allows those features to be dually accessible — both for getting the tech to more users, as well as keeping down costs.
Mashable’s piece is a must-read for Jordyn Castor’s inspiring personal story and its insight into the unique way Apple approaches accessibility. Instead of treating accessibility as a special, add-on feature that is purchased separately, it’s built into the operating system itself. That, combined with the work Apple has done to make accessibility features easy for developers to adopt, reveals a pragmatic approach designed to encourage broad-based adoption, making Apple’s accessibility innovations available to as many people who need them as possible.