Like all Apple products, macOS is an accessible platform. Blind and low vision users can navigate their Mac using VoiceOver, while someone with physical motor delays can use Switch Control to edit videos in Final Cut. And under the Accessibility pane in System Preferences, there is a multitude of other features, from Zoom to Invert Colors to closed captioning. Whatever your need, the breadth and depth of Apple’s accessibility software span many domains. This is why Apple is lauded as the industry leader in accessibility: the tools run deep and they’re well-designed.
Still, accessibility on macOS doesn’t quite reach feature parity with iOS. Amidst rumors that Apple is working on a cross-platform set of APIs to bridge the company’s two primary operating systems, now is an opportune time to consider what each platform does and what they offer one another.
In the context of accessibility, Apple brings consistency to iOS and macOS by sharing features and technologies between the two. As such, there are some iOS-first features macOS sorely needs, while the Mac offers things iOS would benefit from as well. Such enhancements would not only improve the user experience across devices, but would also make iOS and Mac software richer, fuller products overall. And, most importantly, more accessible.
Apple has updated its homepage and accessibility webpage to celebrate Global Accessibility Awareness Day. The event, which Apple has marked in a variety of ways over the years, was created to promote access to technology and foster inclusion for people with disabilities.
Apple’s homepage includes a banner image highlighting the accessibility features of its products with the statement ‘Technology is most powerful when it empowers everyone.’ The Accessibility section elaborates on that idea with a video first published in 2016 and the following:
Taking a family portrait. Catching up over FaceTime. Raising the blinds to let in the morning light. We want everyone to enjoy the everyday moments that technology helps make possible, so we work to make every Apple product accessible from the very start. Because the true value of a device isn’t measured by how powerful it is, but by how much it empowers you.
The remainder of the Accessibility page is organized into sections dedicated to vision, hearing, physical and motor skills, and learning and literacy, which outline the accessibility features of Apple’s products and demonstrate them with animations and videos.
The updates to Apple’s website, along with its announcement that it is working with schools for blind and deaf students to help teach them to code, are a reminder of the importance of accessibility to the company, and will hopefully encourage third-party developers to make accessibility a priority too.
In March, Apple led a Swift Playgrounds course at the Texas School for the Blind and Visually Impaired. Today, which is Global Accessibility Awareness Day, Apple announced that it is partnering with schools in California, Texas, New York, Florida, Illinois, and Massachusetts:
Beginning this fall, schools supporting students with vision, hearing or other assistive needs will start teaching the Everyone Can Code curricula for Swift, Apple’s powerful and intuitive programming language.
Tim Cook, Apple’s CEO, said:
“Apple’s mission is to make products as accessible as possible,” said Tim Cook, Apple’s CEO. “We created Everyone Can Code because we believe all students deserve an opportunity to learn the language of technology. We hope to bring Everyone Can Code to even more schools around the world serving students with disabilities.”
In addition to existing iOS accessibility features, Apple is augmenting the Everyone Can Code curricula with tools and resources targeted at students with visual and hearing impairments.
Apple has proposed a set of accessibility emoji to the Unicode Consortium. According to Emojipedia:
In the opening line of the proposal, Apple writes:
“Apple is requesting the addition of emoji to better represent individuals with disabilities. Currently, emoji provide a wide range of options, but may not represent the experiences of those with disabilities”
Noting that this is “not meant to be a comprehensive list of all possible depictions of disabilities”, Apple goes on to explain that this is intended as “an initial starting point”.
Apple has worked with the American Council of the Blind, the Cerebral Palsy Foundation, and the National Association of the Deaf to develop the emoji.
Among the emoji included in the set are Guide Dog With Harness, Person With White Cane, Ear With Hearing Aid, Deaf Sign, Person in Mechanized Wheelchair, Person in Manual Wheelchair, Mechanical or Prosthetic Arm and Leg, and Service Dog With Vest and Leash.
The proposed emoji, if adopted, wouldn’t appear until Unicode 12.0 is released sometime in the first half of 2019.
Earlier this week, Apple engineers visited the Texas School for the Blind and Visually Impaired where they led a programming course from the company’s Everyone Can Code curriculum. According to the Austin American-Statesman’s technology blog, Open Source, the class was the first such session led by Apple for blind and low-vision students.
With the assistance of VoiceOver, the students completed assignments in Apple’s Swift Playgrounds iPad app. The students also got a chance to go outside and fly Parrot drones using Swift Playgrounds. Vicki Davidson, a technology teacher at the school, told Open Source:
“We see this as a way to get them interested in coding and realize this could open job opportunities,” said Vicki Davidson, a technology teacher at the Texas School for the Blind and Visually Impaired. “Apple has opened up a whole new world for kids by giving them instant access to information and research, and now coding.”
Apple’s director of accessibility, Sarah Herrlinger, who will participate in a session on Innovations in Accessibility at South By Southwest on March 15th, said:
“When we said everyone should be able to code, we really meant everyone,” said Sarah Herrlinger, Apple’s director of accessibility. “Hopefully these kids will leave this session and continue coding for a long time. Maybe it can inspire where their careers can go.”
Swift Playgrounds and Apple’s Everyone Can Code curriculum have grown at a remarkable rate and are fantastic resources for students, teachers, and parents. However, it’s Apple’s long-standing commitment to accessibility across all of its products that helps ensure that those resources are available to as many students as possible.
Steven Aquino covers an important accessibility angle of smart speakers that I've never truly considered:
Since the HomePod started shipping last week, I’ve taken to Twitter on multiple occasions to (rightfully) rant about the inability of Siri—and its competitors—to parse non-fluent speech. By “non-fluent speech,” I’m mostly referring to stutterers because I am one, but it equally applies to others, such as deaf speakers.
This is a topic I’ve covered before. There has been much talk about Apple’s prospects in the smart speaker market; the consensus seems to be the company lags behind Amazon and Google because Alexa and Google Home are smarter than Siri. What is missing from these discussions and from reviews of these products is the accessibility of a HomePod or Echo or Sonos.
As I see it, this lack of consideration, whether intentional or not, overlooks a crucial part of a smart speaker’s story. Smart speakers are a unique product, accessibility-wise, insofar as the voice-first interaction model presents an interesting set of conditions. You can accommodate blindness and low vision with adjustable font sizes and screen readers. You can accommodate physical motor delays with switches. You can accommodate deafness and hearing loss with closed captioning and the camera’s flash for alerts.
But how do you accommodate a speech impairment?
A human assistant would know how to deal with stuttering, dialects, or even just the need to repeat a part of a sentence you got wrong. None of the modern digital assistants currently goes beyond being a slightly humanized command line activated by voice, and I wonder who will get there first.
Steven Aquino on the accessibility implications of Face ID on the iPhone X:
The way Apple has built Face ID, hardware- and software-wise, into iOS quite literally makes using iPhone a “hands-free” experience in many regards. And that’s without discrete accessibility features like Switch Control or AssistiveTouch. That makes a significant difference to users, myself included, whose physical limitations make even the most mundane tasks (e.g., unlocking one’s device) tricky. As with so many accessibility-related topics, the little things that are taken for granted are always the things that matter most.
The combination of Face ID with Raise to Wake (or, arguably, the simplicity of Tap to Wake) truly sounds like a remarkable improvement accessibility-wise, perhaps in a way that we didn’t foresee when we started speculating on Apple abandoning Touch ID. Hands-free unlocking is one of my favorite aspects of the iPhone X experience so far.
Last week we reported on a new cochlear implant that was designed to integrate in special ways with an iPhone. This week, Steven Levy has more details for WIRED on the work that went into bringing this product to fruition.
To solve the huge problem of streaming high-quality audio without quickly draining the tiny zinc batteries in hearing aids, Apple had previously developed a new technology called Bluetooth LEA, or Low Energy Audio. The company released that (but didn’t talk about it) when the first Made for iPhone hearing aids appeared in 2014...“We chose Bluetooth LE technology because that was the lowest power radio we had in our phones,” says Sriram Hariharan, an engineering manager on Apple’s CoreBluetooth team. To make LEA work with cochlear implants he says, “We spent a lot of time tuning our solution to meet the requirements of the battery technology used in the hearing aids and cochlear implants.” Apple understood that, as with all wireless links, some data packets would be lost in transmission—so the team figured out how to compensate for that, and re-transmit them as needed. “All those things came together to figure out how to actually do this,” says Hariharan.
This story perfectly demonstrates how solving accessibility issues may require a lot of hard work and investment, but in the end it can produce results that are truly life-changing.
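To make the packet-loss handling Hariharan describes a bit more concrete, here is a rough, purely illustrative Swift sketch of a sequence-number-and-retransmit scheme. The names and structure are my own assumptions for illustration; Apple’s actual Bluetooth LEA protocol isn’t public.

```swift
// Illustrative only: a toy receiver that tracks sequence numbers on incoming
// audio packets, buffers them, and reports gaps so the sender can retransmit.
// This is NOT Apple's Bluetooth LEA protocol, just the general idea.
import Foundation

struct AudioPacket {
    let sequenceNumber: UInt16
    let payload: Data
}

final class PacketReassembler {
    private var buffer: [UInt16: Data] = [:]
    private var nextExpected: UInt16 = 0

    /// Packets the receiver knows it is missing and should ask the sender to resend.
    private(set) var missing: Set<UInt16> = []

    func receive(_ packet: AudioPacket) {
        buffer[packet.sequenceNumber] = packet.payload
        missing.remove(packet.sequenceNumber)

        // Any sequence numbers between the last contiguous packet and this one
        // that haven't arrived yet are candidates for retransmission.
        if packet.sequenceNumber > nextExpected {
            for seq in nextExpected..<packet.sequenceNumber where buffer[seq] == nil {
                missing.insert(seq)
            }
        }
    }

    /// Returns the longest contiguous run of audio data and advances the cursor.
    func drainContiguousAudio() -> Data {
        var out = Data()
        while let chunk = buffer[nextExpected] {
            out.append(chunk)
            buffer[nextExpected] = nil
            nextExpected &+= 1   // wrapping add, since sequence numbers are 16-bit
        }
        return out
    }
}
```

A real implementation would also have to bound the buffer and conceal packets that never arrive in time, say by repeating or interpolating the previous audio frame, which is the “compensate” part of the quote above.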
Today Cochlear introduced a new cochlear implant sound processor that serves as the first such device directly compatible with iOS devices. The company’s press release states:
With the Nucleus 7 Sound Processor, people with a Cochlear Nucleus Implant can now stream sound directly from a compatible iPhone, iPad and iPod touch directly to their sound processor. They will also be able to control, monitor and customize their hearing on their iPhone or iPod touch through the Nucleus® Smart App available to download for free from the App Store®.
The Nucleus Smart app also includes a feature resembling Apple’s ‘Find My iPhone’ called ‘Find My Processor.’ Especially helpful for children who may be more prone to losing their sound processor, this feature employs an iPhone’s built-in location services to determine the last place the processor was connected to its paired iPhone.
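Conceptually, the feature is straightforward: when the processor’s Bluetooth link drops, the app notes where the phone was at that moment. Here is a hypothetical Swift sketch of that idea using CoreLocation; it’s my own illustration of the concept, not code from Cochlear’s Nucleus Smart app.

```swift
// Hypothetical sketch of a "last known location" feature: when the accessory
// disconnects, record where the phone was at that moment. Not Cochlear's code.
// Requires NSLocationWhenInUseUsageDescription in the app's Info.plist.
import CoreLocation
import Foundation

final class LastSeenRecorder: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private var latestLocation: CLLocation?

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
    }

    // Keep track of the phone's most recent position.
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        latestLocation = locations.last
    }

    /// Call this from the accessory's disconnect callback to remember
    /// the last place the processor was connected to its paired iPhone.
    func accessoryDidDisconnect() {
        guard let location = latestLocation else { return }
        let lastSeen: [String: Double] = [
            "latitude": location.coordinate.latitude,
            "longitude": location.coordinate.longitude,
            "timestamp": location.timestamp.timeIntervalSince1970
        ]
        UserDefaults.standard.set(lastSeen, forKey: "lastSeenProcessorLocation")
    }
}
```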
Sarah Buhr of TechCrunch notes that today’s announcement is the fruit of a lengthy period of research and development within Apple in response to the growing issue of hearing loss.
Apple...has spent a number of years developing a hearing aid program within the company. Apple soon developed a protocol the company offered for free for hearing aid and implant manufacturers to use with their devices.