
Posts tagged with "accessibility"

Vision Accessibility on Apple Vision Pro

I have low vision. A kind you can’t really correct for with glasses or contacts. I also bought Apple Vision Pro at launch. Why would I do this? Well because I’m a nerd who wants to see the future, but also because I was fascinated to see how Apple would handle accessibility for this new product. Apple’s track record on accessibility in the past decade has been stellar, in my opinion, with their teams adding powerful options every year and ensuring every new platform has accessibility support built in from the start.

After watching Apple’s WWDC23 session on visionOS accessibility, I knew accessibility on visionOS was an important point for them. But even after consuming as much information on the platform as I could, I knew I had to try it for myself to know the answer to the important question: how well does it work for me?

Terrific overview of the Accessibility features of visionOS and Vision Pro by Zach Knox.

It’s no surprise to learn that Apple’s Accessibility team did some amazing work for this new platform too, but it’s impressive to see that on day one of the Vision Pro there are already dozens of Accessibility features and accommodations in place. And keep in mind that these are Accessibility options that work with Apple apps and third-party ones, right out of the box. This is the kind of ecosystem advantage and platform integration that newfound tech reviewer Zuckerberg probably forgot to mention in his video.

See also: Tom Moore’s story on trying the Vision Pro with one eye only, Peter Saathoff-Harshfield’s Mastodon thread, Shelly Brisbin’s story for Six Colors, and Ryan Hudson-Peralta’s fantastic overview (via 9to5Mac) of using the Vision Pro without hands, which I’m embedding below.


Apple Marks International Day of Persons with Disabilities with Video and Ebook

Source: Apple.

Sunday, December 3rd is International Day of Persons with Disabilities, and to mark the day, Apple has released a short film directed by Taika Waititi. The film, shot in New Zealand, stars a young girl and a menagerie of fanciful creatures searching for one of their voices. The story is told in rhyme, in a style reminiscent of a children’s book that foreshadows the final scene between a father and his daughter, which I won’t spoil. The film is narrated by Dr. Tristram Ingham, a physician from Waititi’s hometown in New Zealand who has a rare form of muscular dystrophy and who used Personal Voice to record the narration.

Source: Apple.

The video is a touching way of promoting Personal Voice, an important new accessibility feature of iOS 17, iPadOS 17, and macOS Sonoma. Users record 15 minutes of themselves speaking, which is processed privately, on-device, creating a facsimile of their voice. It’s an invaluable feature for anyone who is at risk of losing their voice as a result of a disease or disability.

As Apple explains in its press release, the voice that is created is stored locally on the device on which it is recorded but can be stored in iCloud and shared with up to three devices. If you choose to use iCloud, voice data is end-to-end encrypted. The device on which you record your voice must be password-protected, too.

Source: Apple.

The film released today also highlights Live Speech, a feature of iOS, iPadOS, and macOS that allows users to type responses that are spoken aloud in a system-generated voice or their Personal Voice. You can learn more about setting up Personal Voice and using Live Speech from this Apple Support video on YouTube:

Alongside the video released today is a free ebook version of the story told in the video, which is designed to continue the conversation about voice loss and help people faced with losing their voices.


A Conversation with David Niemeijer of AssistiveWare About Personal Voice, Assistive Access, and Developing Apps for Accessibility

Source: Apple.

Earlier this week, Apple announced a series of new accessibility features coming to its OSes later this year. It was a lot to take in, and it can sometimes be hard to understand how these features translate into real-world benefits for users.

To get a better sense of what some of this week’s announcements mean, I spoke to David Niemeijer, the founder and CEO of AssistiveWare, an Amsterdam-based company that makes augmentative and alternative communication (AAC) apps for the iPhone and iPad, including Proloquo, Proloquo2Go, and Proloquo4Text. Each app addresses different needs, but what they all have in common is helping people who have difficulty expressing themselves verbally.

What follows is a lightly edited version of our conversation.

Let me start by asking you a little bit about AAC apps as a category because I’m sure we have readers who don’t know what they do and what augmentative and alternative communication apps are.

David Niemeijer: So, AAC is really about all ways of communication that do not involve speech. It includes body gestures, it includes things like signing, it includes texting, but in the context of apps, we typically think more about the high-tech kind of solutions that use the technology, but all those other things are also what’s considered AAC because they augment or they are an alternative for speech. These technologies and these practices are used by people who either physically can’t speak or can’t speak in a way that people understand them or that have other reasons why speech is difficult for them.

For example, what we see is that a lot of autistic people find speech extremely exhausting. So in many cases, they can speak, but there are many situations where they’d rather not speak because it drains their energy or where, because of, let’s say, anxiety or stress, speech is one of the first functions that drops, and then they can use AAC.

Source: Apple.

We also see it used by people with cerebral palsy, where it’s actually the muscles that create a challenge. [AAC apps] are used by people who have had a stroke where the brain system that finds the right words and then sends the signals to the muscles is not functioning correctly. So there are many, many reasons. Roughly about 2% of the world population cannot make themselves understood with their own voice.

Read more


Apple Marks Global Accessibility Awareness Day with Features Coming to iOS, iPadOS, and macOS Later This Year

Thursday is Global Accessibility Awareness Day, and as in years past, Apple has previewed several new accessibility features coming later this year. This year, Apple is focusing on a wide range of accessibility features covering cognitive, vision, hearing, mobility, and speech, which were designed with feedback from disability communities. The company hasn’t said when these features will debut in its operating systems, but if past years are any indication, most should be released in the fall as part of the annual OS release cycle.

Assistive Access

Assistive Access. Source: Apple.

Assistive Access is a new customizable iPhone and iPad mode created for users with cognitive disabilities to lighten the cognitive load of using their favorite apps. Apple worked with users as well as their trusted supporters to focus on the activities they use most, like communicating with friends and family, taking and viewing photos, and listening to music. The result is a distilled version of the related apps’ core experiences. For instance, Phone and FaceTime have been combined into a single Calls app that handles both audio and video calls.

Calls, Messages, Camera in Assistive Access mode. Source: Apple.

The UI for Assistive Access is highly customizable, allowing users and their trusted supporters to adapt it to their individual needs. For example, an iPhone’s Home Screen can be streamlined to show just a handful of apps with large, high-contrast buttons with big text labels. Alternatively, Assistive Access can be set up with a row-based UI for people who prefer text.

Read more


Apple Marks Global Accessibility Awareness Day by Announcing Upcoming Accessibility Features

Thursday is Global Accessibility Awareness Day. To mark the occasion, Apple has announced a long list of accessibility features coming to its products later this year and shared other ways it is celebrating the day through its apps and services.

Apple’s press release sums up the features coming to the iPhone, iPad, Mac, and Apple Watch as follows:

Using advancements across hardware, software, and machine learning, people who are blind or low vision can use their iPhone and iPad to navigate the last few feet to their destination with Door Detection; users with physical and motor disabilities who may rely on assistive features like Voice Control and Switch Control can fully control Apple Watch from their iPhone with Apple Watch Mirroring; and the Deaf and hard of hearing community can follow Live Captions on iPhone, iPad, and Mac. Apple is also expanding support for its industry-leading screen reader VoiceOver with over 20 new languages and locales. These features will be available later this year with software updates across Apple platforms.

Door Detection will be part of the Magnifier app later this year. The feature helps blind and low vision users find and navigate doors when they arrive somewhere. It will judge the distance to a door using LiDAR, describe the door’s attributes, like whether it opens by pushing or with a doorknob, and read signs and symbols next to the door.

Apple Watch Mirroring.

The Apple Watch will add several physical and motor accessibility features too. Apple Watch Mirroring, which is built in part on AirPlay, will allow users with physical and motor disabilities to control their Watches from an iPhone using Voice Control, Switch Control, voice commands, sound actions, head tracking, and Made for iPhone switches. The Apple Watch will also add a new double pinch gesture for performing actions like answering and ending phone calls and taking photos.

Apple Watch will add a new double pinch gesture.

For Deaf and hard of hearing users, Apple will add Live Captions on the iPhone, iPad, and Mac. Apple says captioning will work with video calling apps like FaceTime, streaming video services, video conferencing apps, and in-person conversations. Live Captions are generated on-device to preserve privacy, and on the Mac, users will be able to type a response that will be spoken aloud.

VoiceOver will get an update, too, adding the following languages and locales:

  • Arabic (World)
  • Basque
  • Bengali (India)
  • Bhojpuri (India)
  • Bulgarian
  • Catalan
  • Croatian
  • Farsi
  • French (Belgium)
  • Galician
  • Kannada
  • Malay
  • Mandarin (Liaoning, Shaanxi, Sichuan)
  • Marathi
  • Shanghainese (China)
  • Spanish (Chile)
  • Slovenian
  • Tamil
  • Telugu
  • Ukrainian
  • Valencian
  • Vietnamese

VoiceOver on the Mac will also gain Text Checker, a tool that discovers formatting issues.

Additional upcoming accessibility features.

Apple previewed several other upcoming accessibility features across its products, including:

  • Buddy Controller, the ability for someone to use a second game controller to assist with playing a game as though the two controllers were one
  • Siri Pause Time, which will allow users to customize how long Siri waits before responding
  • Voice Control Spelling Mode, for dictating words letter-by-letter
  • Customizable sound recognition of the sounds in your environment
  • New themes and text adjustments in the Books app for a more accessible reading experience

Apple apps and services are celebrating Global Accessibility Awareness Day, too.

Also, Apple has announced that Global Accessibility Awareness Day is being celebrated with Apple Store sessions, an Accessibility Assistant shortcut in the Shortcuts app, special Fitness+ sessions and Time to Walk or Push episodes, an accessibility-oriented Maps guide, and highlighted content on the App Store and in Apple Books, Apple Podcasts, Apple Music, and Apple TV.

We’ve seen Apple announce accessibility features coming to future versions of its OSes before, but today’s announcement stands out for the sheer number of features revealed. I’m eager to try them out. Based on what Apple has said, there seems to be a lot here that will make a meaningful impact on users’ everyday lives.


WordleBot 1.1, Now Fully Accessible with Native Emoji-to-Image Conversion

WordleBot 1.1.

Following the release of my WordleBot shortcut last week, I’ve received a lot of useful and informative feedback from users in the accessibility community regarding the shortcut’s ability to annotate Wordle results with descriptions. Although well-intentioned, my original approach was misguided: even with line-based scores, the grid of emoji characters still performed horribly with screen-reading technologies such as Apple’s VoiceOver. WordleBot didn’t do much to make results more accessible for VoiceOver users since it was only reformatting the grid of emoji characters with additional text.
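To illustrate the underlying problem, here is a rough sketch in Swift (this is not how the WordleBot shortcut itself is built, and the function name is hypothetical) of how a shared grid of Wordle emoji could be condensed into short per-row summaries that a screen reader such as VoiceOver can announce, rather than reading out every colored square individually:

import Foundation

// A rough sketch, not the actual WordleBot shortcut: summarize a shared Wordle
// result into short per-row descriptions that a screen reader such as VoiceOver
// can announce, assuming the standard squares 🟩 (correct), 🟨 (wrong spot),
// and ⬛/⬜ (miss).
func describeWordleResult(_ shared: String) -> String {
    let squares: Set<Character> = ["🟩", "🟨", "⬛", "⬜"]
    // Keep only the lines that are actually made of result squares.
    let rows = shared
        .split(separator: "\n")
        .filter { line in line.contains(where: { squares.contains($0) }) }

    let summaries = rows.enumerated().map { (index, row) -> String in
        let correct = row.filter { $0 == "🟩" }.count
        let misplaced = row.filter { $0 == "🟨" }.count
        return "Row \(index + 1): \(correct) correct, \(misplaced) in the wrong spot."
    }
    return summaries.joined(separator: " ")
}

// Example:
// describeWordleResult("⬛🟨⬛⬛⬛\n🟩🟩⬛⬛🟨\n🟩🟩🟩🟩🟩")
// → "Row 1: 0 correct, 1 in the wrong spot. Row 2: 2 correct, 1 in the wrong spot. Row 3: 5 correct, 0 in the wrong spot."

The point of the sketch is that a handful of spoken words per row conveys the same information as dozens of emoji, which is roughly the kind of concise output that makes results usable with VoiceOver.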

Read more


Apple Announces SignTime and Many Other Accessibility Features Coming to Its Products

Source: Apple.

Tomorrow, Apple will launch SignTime, a service that will pair Apple Store and Apple Support customers with on-demand sign language interpreters. The company has several other accessibility features coming too, which will be released later this year.

SignTime will allow customers to communicate with AppleCare and Retail Customer Care inside their browsers using American Sign Language, British Sign Language, and French Sign Language. The service will also be available in person at retail stores, with no need to make arrangements ahead of time. For now, the service is limited to the US, UK, and France, but Apple says it will roll out to more countries over time.

Later this year, Apple will add AssistiveTouch to the Apple Watch. The feature uses hand clenches, pinch gestures, and hand shaking to navigate and select controls in Watch apps. AssistiveTouch for the Apple Watch takes advantage of the device’s gyroscope and accelerometer along with the heart rate sensor and machine learning.

Also coming later this year, the iPad will gain support for third-party eye-tracking devices to assist users in navigating the iPad’s UI. VoiceOver is also being enhanced with new details about people, text, table data, and other objects, offering far more descriptive information for blind and low vision users than ever before. Users will also be able to add their own image descriptions to their photos using Markup.

For deaf and hard of hearing users, Apple is adding bi-directional hearing aid support. The company is also including support for audiograms, which can be used with Headphone Accommodations to tune playback to a user’s hearing. Background sounds, including balanced, bright, and dark noise as well as ocean, rain, and stream sounds, are being added too.

Source: Apple.

Also coming later this year are Sound Actions for Switch Control, which use mouth sounds in place of switches and buttons, customizable display and text size settings for colorblind users, and new Memoji customizations that let users add oxygen tubes, cochlear implants, and a soft helmet as headwear. Many of Apple’s apps and services will also be marking Global Accessibility Awareness Day, which is tomorrow, with features, curated collections of content, and sessions.

It’s fantastic to see such a long list of accessibility features being added to Apple’s devices tomorrow and later in the year, which presumably means this fall when the company’s OSes are expected to be updated. It’s also unusual for upcoming OS features to be revealed before WWDC, but with Global Accessibility Awareness Day coming tomorrow, it’s an excellent opportunity to spotlight these important additions to Apple’s OSes and perhaps to free up time at WWDC for other announcements.


Apple to Host Interactive Accessibility Event Online as Possible Warm-Up for WWDC

As noted by Steve Troughton-Smith on Twitter and reported by 9to5Mac, Apple has invited developers to attend an online event to learn about the accessibility features of its devices. According to an email message sent to developers, the event will include opportunities to ask questions during and after the presentation and schedule individual consultations.

Last month, Apple announced that WWDC will be online-only this June for the first time. As Troughton-Smith suggested in his tweet, it’s not hard to imagine that Apple is using this week’s accessibility event to test systems that it will use to move WWDC online.

Apple periodically holds events for developers outside the annual WWDC cycle, but this event is a little different, especially the interactive component. I’m curious to see how the accessibility event goes and the mechanics Apple uses to implement developer participation.


For iPad, Accessibility Gives ‘It’s Just a Big iPhone’ New Meaning

Perhaps the most common complaint hurled against the iPad over its first decade of life is that it’s little more than a bigger iPhone. At a fundamental level, the criticism is certainly valid: by and large, the iPad runs the same software as the iPhone. The penchant for bemoaning this bigness emanates from discontentment over the fact that substantial improvements to the iPad’s software have come at a glacially slow pace. Until last year, meaningful upgrades tailored to the tablet were few and far between.1 As much as Apple has extolled the iPad for being “unlike any computer,” the truth is the product stagnated for quite a while in terms of software.2 For better or worse, the company has been preoccupied with savoring every last drop of mother’s milk from the cash cow that is the iPhone. The iPad was left to wither thirstily when it came to its own growth, and it suffered for some time as a result.

In actuality, the iPad being more or less a scaled-up iPhone isn’t necessarily an entirely bad thing. The reason is iOS: familiarity breeds comfort. Apple shrewdly created the iPad’s user interface (and, to lesser extents, those of the Apple Watch and Apple TV) to largely resemble the iPhone’s. Especially for less nerdy users, the consistency across devices makes for a seamless, less intimidating experience. From icons to text to features to the touchscreen, the iPad being so similar to the iPhone means acclimating to the device takes minimal time and effort. From an accessibility standpoint, easy acclimation sets the tone for an enjoyable user experience. The foremost reason this is important is that the easier it is to acclimate to a device, the easier it is to find and configure mission-critical accessibility features.

Thus, it’s not at all unreasonable to look at what was heretofore a pejorative assessment – the iPad is nothing but a big iPhone – and turn it into a positive. One of the unheralded aspects of the device’s success is how its approachable, intuitive nature has made it a hit in accessibility-centric contexts such as special education classrooms and as a communicative aid. Such advances get right at the heart of the oft-cited Steve Jobs quote on the so-called intersection of technology and the liberal arts, when he said, “It’s in Apple’s DNA that technology alone is not enough.” Assistive technology obviously caters to the humanities part of the liberal arts, and it’s not hard to see how the iPad’s roots as ostensibly a bigger iPhone can be an asset rather than a liability. You just have to be willing to keep an open mind.

Read more