Posts tagged with "accessibility"

Hello, Computer: Inside Apple’s Voice Control

This year’s Worldwide Developers Conference was big. From dark mode in iOS 13 to the newly rechristened iPadOS to the unveiling of the born-again Mac Pro and more, Apple’s annual week-long bonanza of all things software was arguably one of the most anticipated and exciting events in recent Apple history.

Accessibility certainly contributed to the bigness as well. Every year Apple moves mountains to ensure accessibility’s presence is felt not only in the software it previews, but also in the sessions, labs, and other social gatherings in and around the San Jose Convention Center.

“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,” Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, said to me following the keynote. “There’s something in each operating system and things for a lot of different types of use cases.”

One announcement that unquestionably garnered some of the biggest buzz during the conference was Voice Control. Available on macOS Catalina and iOS 13, Voice Control is a method of interacting with your Mac or iOS device using only your voice. A collaborative effort between Apple’s Accessibility Engineering and Siri groups, Voice Control aims to revolutionize the way users with certain physical motor conditions access their devices. At a high level, it’s very much a realization of the kind of ambient, voice-first computing dreamed up by sci-fi television stalwarts like The Jetsons and Star Trek decades ago. You talk, it responds.

And Apple could not be more excited about it.

Read more


macOS Accessibility Keyboard

Lovely deep dive by Mattt Thompson on one of macOS’ most powerful Accessibility features – the Accessibility Keyboard:

Today, what’s the difference between a MacBook and an iPad? Practically speaking, you might point to the presence or absence of a physical keyboard, a SIM card, or an ARM processor (and if the rumors about next year’s MacBook models are to be believed, those latter two may soon cease to be a distinction).

For many of us, a physical keyboard is the defining trait that makes a computer a “desktop” computer in the traditional sense; when you purchase an external keyboard for your iPad, you do so to make it “desktop”-like. But for many others — including those of us with a physical disability — a typewriter-like keyboard is but one of many input methods available to desktop users.

This week on NSHipster, we’re taking a look at the macOS Accessibility Keyboard. Beyond its immediate usefulness as an assistive technology, the Accessibility Keyboard challenges us to think differently about the nature of input methods and any remaining distinction between mobile and desktop computers.

Combined with the Panel Editor app, macOS allows you to design any kind of “keyboard” that goes beyond text input. I’ve written about this topic before when I shared my custom Accessibility Keyboard setup to launch AppleScripts, which you can find here.
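
Panel buttons can run AppleScripts directly; as a rough illustration, here’s a minimal Swift sketch that executes the same kind of script programmatically via Foundation’s NSAppleScript. The script source and target app are hypothetical stand-ins, not my actual setup.

import Foundation

// Hypothetical one-liner of the sort a custom Accessibility Keyboard
// panel button might trigger.
let source = """
tell application "Safari" to make new document
"""

if let script = NSAppleScript(source: source) {
    var error: NSDictionary?
    _ = script.executeAndReturnError(&error)
    if let error = error {
        print("AppleScript failed: \(error)")
    }
}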

Permalink

A Timeline of iOS Accessibility: It Started with 36 Seconds

On June 8, 2009, at the end of a two-hour WWDC keynote, Phil Schiller was running through a long list of new features and apps that would be available on the iPhone 3GS, due to ship on June 19 of that year. Phil was pinch-hitting as keynote master of ceremonies for Steve Jobs, who was then on leave, recovering from a liver transplant.

At 1:51:54 in the video, just after he showed off Voice Control and the new Compass app, Apple’s version of the accessibility logo appeared on screen. It’s a stick-style figure with arms and legs outstretched. The icon is still used today.

“We also care greatly about accessibility,” Schiller said, and the slide switched to an iPhone settings screen.

For a total of 36 seconds, Schiller spoke somewhat awkwardly about VoiceOver, Zoom, White on Black (called Invert Colors from iOS 6 onward), and Mono Audio – the first real accessibility features on the iPhone OS platform, as it was then called.

And then it was over. No demo. No applause break.

Schiller moved on to describe the Nike+ app and how it would allow iPhone users to meet fitness goals.

I surveyed a number of liveblogs from that day. About half noted the mention of accessibility features in iPhone OS. The others jumped directly from Compass to Nike+. Accessibility hadn’t made much of a splash.

But in the blindness community, things were very different. Time seemed to stop somewhere after 1:51:54 in the video. Something completely amazing had happened, and only a few people seemed to understand what it meant.

Some were overjoyed, some were skeptical, some were in shock. They all had questions. Would this be a half-hearted attempt that allowed Apple to check the boxes required by government regulations, a PR stunt to attract goodwill? Or would it mean that people who had previously been completely locked out of the iPhone would have a way in?

You can probably guess what the answer is, now that we have ten years of an accessible mobile platform in the rearview mirror – now that Apple is widely credited with offering the best mobile accessibility experience available. But it didn’t all happen at once, and not every step along the way was a positive one.

Audio: Excerpt from ‘36 Seconds That Changed Everything.’

As a companion to my audio documentary, “36 Seconds That Changed Everything: How the iPhone Learned to Talk,” I’ve put together a timeline of iOS accessibility milestones from the past ten years. I’ve focused on Apple hardware and operating systems, though there have also been important Apple app updates, and third-party apps that opened doors to new ways of using iOS accessibly. It’s a list that’s simply too long for this article. And, with a few exceptions, I’ve addressed accessibility-specific features of iOS. Many mainstream features have accessibility applications and benefits, even if they don’t fit here directly.

Read more


Surveying Apple’s Latest Accessibility Work

Steven Aquino, writing for TechCrunch:

Although much of the conversation around what Apple announced revolves around iPadOS and Project Catalyst, based on what I’m hearing on podcasts and seeing in my Twitter timeline, Voice Control definitely is a crown jewel too. Nearly everyone has praised not only the engineering that went into developing it, but also the fact that Apple continues to lead the industry at making accessibility a first-class citizen. Myke Hurley said it best on the Upgrade podcast following the event, the weekly show he co-hosts with Jason Snell, when he said Voice Control is something Apple doesn’t have to do. They do it, he said, because it’s the right thing to do for every user.

Aquino interviewed Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, about three major new accessibility features: Voice Control, Hover Text, and pointing device support. While the iPad enthusiast in me is all about those pointing devices, Voice Control is a tremendously exciting technology that I hope has ramifications for the future of Siri.

Permalink

Why Shortcuts Matter for Accessibility

Since it was announced at WWDC over the summer, the lion’s share of conversation around shortcuts has been about getting things done quickly and efficiently. Apple’s marketing message focuses on how shortcuts in iOS 12 help “streamline the things you do often” using Siri and/or the Shortcuts app. The company also recently put out a press release highlighting top App Store apps that have integrated shortcuts to extend their functionality, touting them for “making [users’] favorite apps even easier to use with a simple tap or by asking Siri.”

While the convenience factor of shortcuts is appreciated, an important part of their utility is accessibility. It’s a crucial piece of the story, because while everyone loves a time-saver or two, these workflows also have the potential to make iPhone and iPad more accessible. In an accessibility context, shortcuts can be lifesavers, reducing cognitive load, excessive swiping and tapping, and other common points of friction often met by disabled users.

Shortcuts, Past and Present

Before considering shortcuts as an accessibility tool, it’s important to understand their roots in order to put them in proper perspective. The idea that shortcuts, or workflows, can prove valuable as an assistive technology isn’t a novel one.

Workflow, on which the Shortcuts app is based, was acquired by Apple in early 2017. Two years earlier, however, Apple selected Workflow as an Apple Design Award winner primarily for its integration of iOS accessibility features. Ari Weinstein, who joined Apple to work on Shortcuts post-acquisition, told me in an interview at WWDC 2015 that he and his team received feedback from several blind and visually impaired users who were curious about Workflow and wanted to try it. As a result, the team felt adding VoiceOver support was “the right thing to do,” Weinstein said.
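
For a sense of what “adding VoiceOver support” involves at the code level, here’s a minimal UIKit sketch in Swift. The control and its descriptions are hypothetical examples, not Workflow’s actual code.

import UIKit

// Hypothetical control standing in for a Workflow-style "run" button.
let runButton = UIButton(type: .system)
runButton.setTitle("Run", for: .normal)

// Describe the control to VoiceOver so it reads as more than just "button."
runButton.isAccessibilityElement = true
runButton.accessibilityLabel = "Run workflow"
runButton.accessibilityHint = "Runs the currently selected workflow."
runButton.accessibilityTraits = .button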

To paraphrase Kendrick Lamar, Shortcuts got accessibility in its DNA.

Given that history, it’s not at all far-fetched that the Shortcuts app would appeal to disabled users. Like Overcast and Twitterrific, Shortcuts is an app built for the mainstream, yet it has the care and design sensibility to carry relevance for a variety of use cases, like being fully accessible to a blind user via VoiceOver. This isn’t small potatoes; given Apple’s commitment to the disabled community, it’s certainly plausible that Workflow’s attention to accessibility made the app all the more desirable.

More Than Just Productivity

As I reported during WWDC, Apple’s focus this year, software-wise, marked a departure from how they’ve traditionally approached accessibility enhancements. Unlike past years, there were no new discrete accessibility features for any platform. (AirPods with Live Listen comes close.) Instead, Apple chose to hammer on the idea that the tentpole features (e.g. Group FaceTime in iOS 12, Walkie-Talkie in watchOS 5) can be enabling technologies. The overarching theme of the conference was that the new features were so well designed that they brought inherent accessibility gains.

Siri shortcuts is another of those features. In my briefings with Apple at WWDC and since, shortcuts has been one of the first items they wanted to discuss. Like Group FaceTime and others, the company firmly believes in shortcuts’ potential as an accessibility aid. Their enthusiasm is warranted: for many users with certain cognitive and/or physical motor delays, the consolidation of tasks can reduce the friction of remembering how to perform a task and then doing it. In this way, shortcuts are the inverse of task analyses; rather than breaking tasks down into their individual parts (e.g. tapping a series of buttons in an app), the Shortcuts app’s automation condenses them into a single step. (You break down steps when creating your own workflows, but that’s beside the point being made here.) And let’s not forget Siri: being able to use your voice to activate shortcuts is a boon for people with motor delays, as the hands-free experience can be empowering.
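
To make the developer side of this concrete, here’s a minimal Swift sketch of donating an action to Siri via the NSUserActivity API Apple introduced alongside shortcuts in iOS 12. The activity type, titles, and invocation phrase are hypothetical.

import UIKit
import Intents

// Hypothetical donation: tells the system the user performed an action,
// so Siri can suggest it later or run it from a recorded voice phrase.
let activity = NSUserActivity(activityType: "com.example.app.play-album")
activity.title = "Play Album"
activity.suggestedInvocationPhrase = "Play my album"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true
activity.persistentIdentifier = "play-album"

// Attaching the activity to the on-screen view controller performs the
// donation; a placeholder controller stands in here.
let viewController = UIViewController()
viewController.userActivity = activity

Once donated, the action can surface as a Siri suggestion or be bound to a custom recorded phrase in Settings, which is precisely the hands-free path described above.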

For disabled people, shortcuts’ focus on speed and accessibility can open up new possibilities in terms of what they can do with their iOS devices and how they do things. Throw in system accessibility features like VoiceOver and Dynamic Type, and the Shortcuts app becomes far more compelling than a mere productivity tool.

”We see huge accessibility potential with Siri Shortcuts and the Shortcuts app. It’s already making a difference — helping people across a wide range of assistive needs simplify every-day tasks like getting to work, coming home, or staying in touch with friends and family,” Sarah Herrlinger, Apple’s Senior Director of Global Accessibility Policy & Initiatives, said in a statement. “We’re getting great feedback about how powerful the technology is in streamlining frequent tasks and integrating multiple app functions with just a single voice command or tap.”

How I Use Shortcuts

I am far less prolific in my adoption of shortcuts than some. Others, like Federico and Matthew Cassinelli, are far better versed in the intricacies of what is possible and, more importantly, how to chain certain commands together.

My needs for shortcuts are pretty spartan. The shortcuts I use most often are practical, everyday ones I found in the Gallery section of the app. I currently have thirteen shortcuts; of those, the most heavily used are the laundry timer, the tip calculator, and one for texting my girlfriend. While I have enjoyed spelunking through Federico’s work for esoteric, power user shortcuts, the reality is my work doesn’t require much automation. I typically don’t need to do fancy things with images, text, and the like. That isn’t to say these tools aren’t cool or valuable; they’re just not necessarily for me. For my needs, quick access to, say, the laundry timer is worth its weight in gold because I always forget to move my clothes.

Consider another shortcut of mine, Play an Album. I’ve been listening to Eminem’s new album, Kamikaze, virtually non-stop since it came out at the end of August. Rather than manually launching the Music app, finding the album in my recently played queue, and hitting play, I can use the Shortcuts widget to play it with a single tap. The manual method takes three steps, which, while not tedious for me in any way, is more work. Going back to the task analysis analogy I used earlier, not only is Play an Album faster, it helps me conserve precious visual energy I otherwise would have expended finding the album. On the fine-motor side, the shortcut also spares my fingers the potential cramping caused by my cerebral palsy. Again, what can take multiple taps can be condensed into a single motion. For many, that’s a huge win.

The same concept applies to sending iMessages to my girlfriend. Using the shortcut, what would normally be a multi-step process is reduced to a single step. For me, the advantage is a matter of kinetics; for others, it could very well reduce cognitive load and increase executive function. That’s not insignificant.

The Bottom Line

As is the case with technologies not built expressly for accessibility’s sake, like Markdown and Apple Pay, the Shortcuts app is so well considered and approachable that anyone can use it, regardless of ability. There are no complicated settings or special modes; as Apple designed it, it just works as intended.

That’s what makes Shortcuts’ star shine brighter. Yes, Apple is pitching it for speed and convenience. Yes, shortcuts can be as pedestrian or as nerdy as you want them to be. Above all, however, the Shortcuts app is accessible: it’s within reach of the widest possible audience, and that turns its utilitarianism into something far greater.


Consistent Accessibility: How Apple Can Make iOS and the Mac More Accessible

Like all Apple products, macOS is an accessible platform. Blind and low vision users can navigate their Mac using VoiceOver, while someone who has physical motor delays can use Switch Control to edit videos in Final Cut. And under the Accessibility pane in System Preferences, there is a multitude of other features one can use, ranging from Zoom to Invert Colors to closed-captioning and more. Whatever your need, the breadth and depth of Apple’s accessibility software spans many domains. This is why Apple is lauded as the industry leader in accessibility: the tools run deep and they’re well-designed.

Still, accessibility on macOS doesn’t quite reach feature parity with iOS. Amidst rumors that Apple is working on a cross-platform set of APIs to bridge the company’s two primary operating systems, now is an opportune time to consider what each platform does and what they offer one another.

In the context of accessibility, the way Apple brings consistency between iOS and macOS is by sharing features and technologies between the two. As such, there are some iOS-first features macOS sorely needs, while the Mac offers things iOS would benefit from as well. Such enhancements would not only improve the user experience across devices, but would also make iOS and Mac software richer, fuller products overall. And most importantly, more accessible.

Read more


Apple Updates Its Website for Global Accessibility Awareness Day

Apple has updated its homepage and accessibility webpage to celebrate Global Accessibility Awareness Day. The event, which Apple has marked in a variety of ways over the years, was created to promote access to technology and foster inclusion for people with disabilities.

Apple’s homepage includes a banner image highlighting the accessibility features of its products with the statement ‘Technology is most powerful when it empowers everyone.’ The Accessibility section elaborates on that idea with a video first published in 2016 and the following:

Taking a family portrait. Catching up over FaceTime. Raising the blinds to let in the morning light. We want everyone to enjoy the everyday moments that technology helps make possible, so we work to make every Apple product accessible from the very start. Because the true value of a device isn’t measured by how powerful it is, but by how much it empowers you.

The remainder of the Accessibility page is organized into sections dedicated to vision, hearing, physical and motor skills, and learning and literacy, which outline the accessibility features of Apple’s products and demonstrate them with animations and videos.

The updates to Apple’s website and its announcement that it is working with schools for blind and deaf students to help teach them to code are a reminder of the importance of accessibility to the company, and will hopefully encourage third-party developers to make accessibility a priority too.


Apple Announces ‘Everyone Can Code’ Partnership with Schools for Blind and Deaf Students

In March, Apple led a Swift Playgrounds course at the Texas School for the Blind and Visually Impaired. Today, which is Global Accessibility Awareness Day, Apple announced that it is partnering with schools in California, Texas, New York, Florida, Illinois, and Massachusetts:

Beginning this fall, schools supporting students with vision, hearing or other assistive needs will start teaching the Everyone Can Code curricula for Swift, Apple’s powerful and intuitive programming language.

Apple’s press release quotes CEO Tim Cook:

“Apple’s mission is to make products as accessible as possible,” said Tim Cook, Apple’s CEO. “We created Everyone Can Code because we believe all students deserve an opportunity to learn the language of technology. We hope to bring Everyone Can Code to even more schools around the world serving students with disabilities.”

In addition to existing iOS accessibility features, Apple is augmenting the Everyone Can Code curricula with tools and resources targeted at students with visual and hearing impairments.

Permalink

Apple Proposes Accessibility Emoji

Apple has proposed a set of accessibility emoji to the Unicode Consortium. According to Emojipedia:

In the opening line of the proposal, Apple writes:

“Apple is requesting the addition of emoji to better represent individuals with disabilities. Currently, emoji provide a wide range of options, but may not represent the experiences of those with disabilities”

Noting that this is “not meant to be a comprehensive list of all possible depictions of disabilities”, Apple goes on to explain that this is intended as “an initial starting point”.

Apple has worked with the American Council of the Blind, the Cerebral Palsy Foundation, and the National Association of the Deaf to develop the emoji.

Among the emoji included in the set are Guide Dog With Harness, Person With White Cane, Ear With Hearing Aid, Deaf Sign, Person in Mechanized Wheelchair, Person in Manual Wheelchair, Mechanical or Prosthetic Arm and Leg, and Service Dog With Vest and Leash.

The proposed emoji, if adopted, wouldn’t appear until Unicode 12.0 is released sometime in the first half of 2019.

Permalink