
Posts tagged with "accessibility"

Apple Marks Global Accessibility Awareness Day by Announcing Upcoming Accessibility Features

Thursday is Global Accessibility Awareness Day. To mark the occasion, Apple has announced a long list of accessibility features coming to its products later this year and shared other ways it is celebrating the day through its apps and services.

Apple’s press release sums up the features coming to the iPhone, iPad, Mac, and Apple Watch as follows:

Using advancements across hardware, software, and machine learning, people who are blind or low vision can use their iPhone and iPad to navigate the last few feet to their destination with Door Detection; users with physical and motor disabilities who may rely on assistive features like Voice Control and Switch Control can fully control Apple Watch from their iPhone with Apple Watch Mirroring; and the Deaf and hard of hearing community can follow Live Captions on iPhone, iPad, and Mac. Apple is also expanding support for its industry-leading screen reader VoiceOver with over 20 new languages and locales. These features will be available later this year with software updates across Apple platforms.

Door Detection will be part of the Magnifier app later this year, helping blind and low vision users find and navigate doors when they arrive at a destination. Using LiDAR, the feature will judge the distance to a door, describe the door’s attributes, such as whether it opens by pushing or with a doorknob, and read signs and symbols near it.

Apple Watch Mirroring.

The Apple Watch will add several physical and motor accessibility features too. Apple Watch Mirroring, which is built in part on AirPlay, will allow users with physical and motor disabilities to control their Watches from an iPhone using Voice Control, Switch Control, voice commands, sound actions, head tracking, and Made for iPhone switches. The Apple Watch will also add a new double pinch gesture for controlling actions like answering and ending phone calls and taking photos.

Apple Watch will add a new double pinch gesture.

For Deaf and hard of hearing users, Apple will add Live Captions on the iPhone, iPad, and Mac. Apple says captioning will work with video calling apps like FaceTime, streaming video services, video conferencing apps, and in-person conversations. Live Captions are generated on-device to preserve privacy, and on the Mac, users will be able to type a response that will be spoken aloud.

VoiceOver will get an update, too, adding the following languages:

  • Arabic (World)
  • Basque
  • Bengali (India)
  • Bhojpuri (India)
  • Bulgarian
  • Catalan
  • Croatian
  • Farsi
  • French (Belgium)
  • Galician
  • Kannada
  • Malay
  • Mandarin (Liaoning, Shaanxi, Sichuan)
  • Marathi
  • Shanghainese (China)
  • Slovenian
  • Spanish (Chile)
  • Tamil
  • Telugu
  • Ukrainian
  • Valencian
  • Vietnamese

VoiceOver on the Mac will also gain a new Text Checker tool that discovers formatting issues.

Additional upcoming accessibility features.

Apple previewed several other upcoming accessibility features across its products, including:

  • Buddy Controller, the ability for someone to use a second game controller to assist with playing a game as though the two controllers were one
  • Siri Pause Time, which will allow users to customize the period Siri waits before responding to a user
  • Voice Control Spelling Mode, for dictating words letter-by-letter
  • Customizable sound recognition of the sounds in your environment
  • New themes and text adjustments in the Books app for a more accessible reading experience

Apple apps and services are celebrating Global Accessibility Awareness Day, too.

Apple also announced that it is celebrating Global Accessibility Awareness Day with Apple Store sessions, an Accessibility Assistant shortcut in the Shortcuts app, special Fitness+ sessions and Time to Walk or Push episodes, an accessibility-oriented Maps guide, and highlighted content on the App Store and in Apple Books, Apple Podcasts, Apple Music, and Apple TV.

We’ve seen Apple announce accessibility features coming to future versions of its OSes before, but today’s announcement stands out for the sheer number of features revealed. I’m eager to try them out. Based on what Apple has said, there seems to be a lot here that will have a meaningful impact on many users’ everyday lives.


WordleBot 1.1, Now Fully Accessible with Native Emoji-to-Image Conversion

WordleBot 1.1.

Following the release of my WordleBot shortcut last week, I’ve received a lot of useful and informative feedback from users in the accessibility community regarding the shortcut’s ability to annotate Wordle results with descriptions. Although well-intentioned, my original approach was misguided: even with line-based scores, the grid of emoji characters still performed horribly with screen-reading technologies such as Apple’s VoiceOver. WordleBot didn’t do much to make results more accessible for VoiceOver users since it was only reformatting the grid of emoji characters with additional text.

Read more


Apple Announces SignTime and Many Other Accessibility Features Coming to Its Products

Source: Apple.

Tomorrow, Apple will launch SignTime, a service that will pair Apple Store and Apple Support customers with on-demand sign language interpreters. The company has several other accessibility features coming too, which will be released later this year.

SignTime will allow customers to communicate with AppleCare and Retail Customer Care in their web browsers using American Sign Language, British Sign Language, and French Sign Language. The service will also be available in person at retail stores without making arrangements ahead of time. For now, the service is limited to the US, UK, and France, but Apple says it will roll out to more countries over time.

Later this year, Apple will add AssistiveTouch to the Apple Watch. The feature uses hand clenches, pinch gestures, and hand shaking to navigate and select controls in Watch apps. AssistiveTouch for the Apple Watch takes advantage of the device’s gyroscope and accelerometer, along with the heart rate sensor and machine learning.

Also, coming later this year, the iPad will gain support for third-party eye-tracking devices to assist users in navigating the iPad’s UI. VoiceOver is also being enhanced with new details about people, text, table data, and other objects. The feature will offer far more descriptive information for blind and low vision users than ever before. Users will also be able to add their own image descriptions to their photos using Markup.

For deaf and hard of hearing users, Apple is adding bi-directional hearing aid support. The company is also including support for audiograms, which can be used with Headphone Accommodations to tune playback to a user’s hearing. Background sounds are being added too, including balanced, bright, and dark noise, as well as ocean, rain, and stream sounds.

Source: Apple.

Also coming later this year are Sound Actions for Switch Control, which use mouth sounds in place of switches and buttons; customizable display and text size settings for colorblind users; and new Memoji customizations that let users add oxygen tubes, cochlear implants, and a soft helmet for headwear. Many of Apple’s apps and services will also mark Global Accessibility Awareness Day, which is tomorrow, with features, curated collections of content, and sessions.

It’s fantastic to see such a long list of accessibility features being added to Apple’s devices tomorrow and later in the year, which presumably means this fall when the company’s OSes are expected to be updated. It’s also unusual for upcoming OS features to be revealed before WWDC, but with Global Accessibility Awareness Day coming tomorrow, it’s an excellent opportunity to spotlight these important additions to Apple’s OSes and perhaps to free up time at WWDC for other announcements.


Apple to Host Interactive Accessibility Event Online as Possible Warm-Up for WWDC

As noted by Steve Troughton-Smith on Twitter and reported by 9to5Mac, Apple has invited developers to attend an online event to learn about the accessibility features of its devices. According to an email message sent to developers, the event will include opportunities to ask questions during and after the presentation and schedule individual consultations.

Last month, Apple announced that WWDC will be online-only this June for the first time. As Troughton-Smith suggested in his tweet, it’s not hard to imagine that Apple is using this week’s accessibility event to test systems that it will use to move WWDC online.

Apple periodically holds events for developers outside the annual WWDC cycle, but this event is a little different, especially the interactive component. I’m curious to see how the accessibility event goes and the mechanics Apple uses to implement developer participation.


For iPad, Accessibility Gives ‘It’s Just a Big iPhone’ New Meaning

Perhaps the most common complaint hurled against the iPad over its first decade of life is that it’s little more than a bigger iPhone. At a fundamental level, the criticism is certainly valid: by and large, the iPad runs the same software as the iPhone. The penchant for bemoaning this bigness emanates from discontentment over the fact that substantial improvements to the iPad’s software have come at a glacially slow pace. Until last year, meaningful upgrades tailored to the tablet were few and far between.1 As much as Apple has extolled the iPad for being “unlike any computer,” the truth is the product stagnated for quite a while in terms of software.2 For better or worse, the company has been preoccupied with savoring every last drop of mother’s milk from the cash cow that is the iPhone. The iPad was left to wither thirstily when it came to its own growth, and it suffered for some time as a result.

In actuality, the iPad being more or less a scaled-up iPhone isn’t necessarily an entirely bad thing. The reason is iOS: familiarity breeds comfort. Apple shrewdly designed the iPad’s user interface (and, to lesser extents, those of the Apple Watch and Apple TV) to largely resemble the iPhone’s. Especially for less nerdy users, the consistency across devices makes for a seamless, less intimidating experience. From icons to text to features to the touchscreen, the iPad being so similar to the iPhone means acclimating to the device takes minimal time and effort. From an accessibility standpoint, easy acclimation sets the tone for an enjoyable user experience. This is important foremost because the easier it is to acclimate to a device, the easier it is to find and configure mission-critical accessibility features.

Thus, it’s not at all unreasonable to look at what was heretofore a pejorative assessment – the iPad is nothing but a big iPhone – and turn it into a positive. One of the unheralded aspects of the device’s success is how its approachable, intuitive nature has made it a hit in accessibility-centric contexts such as special education classrooms and as a communicative aid. Such advances get right at the heart of the oft-cited Steve Jobs quote on the so-called intersection of technology and the liberal arts, when he said, “It’s in Apple’s DNA that technology alone is not enough.” Assistive technology obviously caters to the humanities part of the liberal arts, and it’s not hard to see how the iPad’s roots as ostensibly a bigger iPhone can be an asset rather than a liability. You just have to be willing to keep an open mind.

Read more


Hello, Computer: Inside Apple’s Voice Control

This year’s Worldwide Developers Conference was big. From dark mode in iOS 13 to the newly rechristened iPadOS to the unveiling of the born-again Mac Pro and more, Apple’s annual week-long bonanza of all things software was arguably one of the most anticipated and exciting events in recent Apple history.

Accessibility certainly contributed to the bigness as well. Every year Apple moves mountains to ensure accessibility’s presence is felt not only in the software it previews, but also in the sessions, labs, and other social gatherings in and around the San Jose Convention Center.

“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,” Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, said to me following the keynote. “There’s something in each operating system and things for a lot of different types of use cases.”

One announcement that unquestionably garnered some of the biggest buzz during the conference was Voice Control. Available on macOS Catalina and iOS 13, Voice Control is a method of interacting with one’s Mac or iOS device using only one’s voice. A collaborative effort between Apple’s Accessibility Engineering and Siri groups, Voice Control aims to revolutionize the way users with certain physical motor conditions access their devices. At a high level, it’s very much a realization of the kind of ambient, voice-first computing dreamed up by sci-fi television stalwarts like The Jetsons and Star Trek decades ago. You talk, it responds.

And Apple could not be more excited about it.

Read more


macOS Accessibility Keyboard

Lovely deep dive by Mattt Thompson on one of macOS’ most powerful Accessibility features – the Accessibility Keyboard:

Today, what’s the difference between a MacBook and an iPad? Practically speaking, you might point to the presence or absence of a physical keyboard, a SIM card, or an ARM processor (and if the rumors about next year’s MacBook models are to be believed, those latter two may soon cease to be a distinction).

For many of us, a physical keyboard is the defining trait that makes a computer a “desktop” computer in the traditional sense; when you purchase an external keyboard for your iPad, you do so to make it “desktop”-like. But for many others — including those of us with a physical disability — a typewriter-like keyboard is but one of many input methods available to desktop users.

This week on NSHipster, we’re taking a look at the macOS Accessibility Keyboard. Beyond its immediate usefulness as an assistive technology, the Accessibility Keyboard challenges us to think differently about the nature of input methods and any remaining distinction between mobile and desktop computers.

Combined with the Panel Editor app, macOS allows you to design any kind of “keyboard” that goes beyond text input. I’ve written about this topic before when I shared my custom Accessibility Keyboard setup to launch AppleScripts, which you can find here.

Permalink

A Timeline of iOS Accessibility: It Started with 36 Seconds

On June 8, 2009, at the end of a two-hour WWDC keynote, Phil Schiller was running through a long list of new features and apps that would be available on the iPhone 3GS, due to ship on June 19 of that year. Phil was pinch-hitting as keynote master of ceremonies for Steve Jobs, who was then on leave, recovering from a liver transplant.

At 1:51:54 in the video, just after he showed off Voice Control and the new Compass app, Apple’s version of the accessibility logo appeared on screen. It’s a stick-style figure with arms and legs outstretched. The icon is still used today.

“We also care greatly about accessibility,” Schiller said, and the slide switched to an iPhone settings screen.

For a total of 36 seconds, Schiller spoke somewhat awkwardly about VoiceOver, Zoom, White on Black (called Invert Colors from iOS 6 onward), and Mono Audio – the first real accessibility features on the iPhone OS platform, as it was then called.

And then it was over. No demo. No applause break.

Schiller moved on to describe the Nike+ app and how it would allow iPhone users to meet fitness goals.

I surveyed a number of liveblogs from that day. About half noted the mention of accessibility features in iPhone OS. The others jumped directly from Compass to Nike+. Accessibility hadn’t made much of a splash.

But in the blindness community, things were very different. Time seemed to stop somewhere after 1:51:54 in the video. Something completely amazing had happened, and only a few people seemed to understand what it meant.

Some were overjoyed, some were skeptical, some were in shock. They all had questions. Would this be a half-hearted attempt that would allow Apple to fill in the checkboxes required by government regulations, a PR stunt to attract goodwill? Or would it mean that people who had previously been completely locked out of the iPhone would have a way in?

You can probably guess what the answer is, now that we have ten years of an accessible mobile platform in the rearview mirror – now that Apple is widely credited with offering the best mobile accessibility experience available. But it didn’t all happen at once, and not every step along the way was a positive one.

Excerpt from ‘36 Seconds That Changed Everything.’

As a companion to my audio documentary, “36 Seconds That Changed Everything: How the iPhone Learned to Talk,” I’ve put together a timeline of iOS accessibility milestones from the past ten years. I’ve focused on Apple hardware and operating systems, though there have also been important Apple app updates, and third-party apps that opened doors to new ways of using iOS accessibly. It’s a list that’s simply too long for this article. And, with a few exceptions, I’ve addressed accessibility-specific features of iOS. Many mainstream features have accessibility applications and benefits, even if they don’t fit here directly.

Read more


Surveying Apple’s Latest Accessibility Work

Steven Aquino, writing for TechCrunch:

Although much of the conversation around what Apple announced revolves around iPadOS and Project Catalyst, based on what I’m hearing on podcasts and seeing in my Twitter timeline, Voice Control definitely is a crown jewel too. Nearly everyone has praised not only the engineering that went into developing it, but also the fact that Apple continues to lead the industry at making accessibility a first-class citizen. Myke Hurley said it best on the Upgrade podcast following the event, the weekly show he co-hosts with Jason Snell, when he said Voice Control is something Apple doesn’t have to do. They do it, he said, because it’s the right thing to do for every user.

Aquino interviewed Sarah Herrlinger, Apple’s Director of Global Accessibility Policy & Initiatives, about three major new accessibility features: Voice Control, Hover Text, and pointing device support. While the iPad enthusiast in me is all about those pointing devices, Voice Control is a tremendously exciting technology that I hope has ramifications for the future of Siri.

Permalink