Everyone acknowledges the societal and technological effects the iPhone has had on the world. In late 2007, Time named the original model its "Invention of the Year," and rightly proclaimed it "the phone that changed phones forever." Eleven years on, it is genuinely difficult to remember the world before the iPhone existed. Whatever your platform allegiance, there can be no disputing that the first iPhone pioneered the notion that everyone should carry a touchscreen supercomputer with them wherever they go. In hindsight, Steve Jobs wasn’t exaggerating when he boasted Apple would reinvent the phone.
Yet for everything the iPhone has meant to smartphones and to the world, there is a segment of users for whom the iPhone has been truly revolutionary: disabled people. For many people with disabilities, myself included, the iPhone was the first accessible smartphone. The device’s multitouch user interface and large (for the time) display represented a total break from the smartphone conventions of the day. An unheralded ramification of this was how accessible these features made the iPhone. For example, the soft keyboard allowed users to compose text messages and emails without struggling with the T9 keypads that were commonplace at the time. Likewise, the 3.5-inch display made seeing content markedly easier than on the postage stamp-sized screens that dominated cell phones then. It’s a testament to the original iPhone’s greatness that its fundamental components were so solid that they redefined accessible computing, all without being "accessible" in the traditional sense. Its impact is put into greater perspective when you consider the first two versions of iOS (née iPhone OS) didn’t contain discrete accessibility features. The first of them, VoiceOver, Zoom, and Mono Audio, debuted in 2009 with the iPhone 3GS.
Like all Apple products, macOS is an accessible platform. Blind and low vision users can navigate their Mac using VoiceOver, while someone who has physical motor delays can use Switch Control to edit videos in Final Cut. And under the Accessibility pane in System Preferences, there is a multitude of other features one can use, ranging from Zoom and Invert Colors to closed captioning and more. Whatever your need, the breadth and depth of Apple’s accessibility software spans many domains. This is why Apple is lauded as the industry leader in accessibility: the tools run deep and they’re well-designed.
Still, accessibility on macOS doesn’t quite reach feature parity with iOS. Amidst rumors that Apple is working on a cross-platform set of APIs to bridge the company’s two primary operating systems, now is an opportune time to consider what each platform does well and what the two could offer one another.
In the context of accessibility, Apple brings consistency to iOS and macOS by sharing features and technologies between the two. As such, there are some iOS-first features macOS sorely needs, while the Mac offers things iOS would benefit from as well. Such enhancements would not only improve the user experience across devices, but would also make iOS and Mac software richer, fuller products overall. And most importantly, more accessible.
People talk about how an Apple product such as the iPhone has a halo effect on customers. If you buy an iPhone and like it, the theory goes, you're more inclined to buy another Apple device, like a MacBook. This theory has certainly proven true in my experience – since buying my first iPhone (my first Apple product) in 2007, I've bought numerous other Apple products and subscribed to several Apple services in the years since. Put another way, I was entrenched in the Apple ecosystem long before I started covering the company for a living.
Recently, a different kind of halo effect has settled on me. I've been using an iPad Pro for the past several weeks, and I absolutely love it. Like Federico, I've made the iPad my computer of choice because of my deep familiarity with iOS and the ways in which working from a touchscreen device makes computing more easily accessible. Coming from my old iPad Air 1, the iPad Pro has intensified my affinity for the iPad and iOS in general. It has impressed me not merely with its technical or software merits, but with one seemingly obvious thing: its screen.
To my knowledge, the release of Night Shift in iOS 9.3 marks only the second time in recent history that a mid-cycle iOS update has included a change or feature with potential accessibility ramifications. The other occurrence, in my mind, was iOS 7.1 beta 2, released in 2013. In it, Apple added a Button Shapes option to Accessibility as a way to assuage users who have trouble distinguishing an actionable button from a text label. Generally, however, any significant additions or changes to the Accessibility feature set come with a major new version of iOS. That is to say, the version Craig Federighi talks about at the annual WWDC keynote.
Before getting into Night Shift's accessibility merit, it's worth examining why it exists. The impetus for Night Shift is better sleep. Apple explains in its marketing material for iOS 9.3 that a person's circadian rhythm can be disrupted by the "bright blue light" emitted from an iPhone or iPad's screen, making it difficult to fall asleep. What Night Shift does to combat this, according to Apple, is "use your iOS device's clock and geolocation to determine when it's sunset in your location." After gathering that data, the software then "automatically shifts the colors in your display to the warmer end of the spectrum." The end result is a display that's easier on the eyes, thus hopefully making it easier to fall asleep. (The display settings will revert to normal in the morning. There's an option to schedule Night Shift as well.) For more on why Night Shift is important and how it works, iMore has posted a good explainer on the feature.
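To make the mechanism a little more concrete, here is a rough Swift sketch of the idea. It is not Apple's implementation, which works systemwide and isn't exposed to developers; it's an in-app approximation that lays a warm tint over the app's own window once an assumed sunset time has passed. The sunset and sunrise values are assumed to come from a solar calculation elsewhere.

```swift
import UIKit

// A rough, in-app sketch of the Night Shift idea (Apple's systemwide
// implementation is private). Given assumed sunset and sunrise times,
// overlay the app's window with a faint, warm-colored tint at night.
func applyWarmTintIfNeeded(to window: UIWindow, sunset: Date, sunrise: Date) {
    let now = Date()
    let isNight = now >= sunset || now <= sunrise
    let tintTag = 9_999  // hypothetical tag used to find and remove the overlay

    // Remove any existing tint, then re-add it only if it's nighttime.
    window.viewWithTag(tintTag)?.removeFromSuperview()
    guard isNight else { return }

    let overlay = UIView(frame: window.bounds)
    overlay.tag = tintTag
    overlay.isUserInteractionEnabled = false
    overlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    // Shift toward the warmer end of the spectrum: a faint orange wash.
    overlay.backgroundColor = UIColor(red: 1.0, green: 0.6, blue: 0.2, alpha: 0.15)
    window.addSubview(overlay)
}
```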
I've made the case more than once that accessibility, conceptually, is not a domain exclusive to the disabled. Certainly, persons with disabilities will always be the target market for accessibility features, but I think many fully-abled people overlook the fact that accessibility features can help them too. To me, the canonical example is larger text. Yes, something like Larger Dynamic Type is a boon to the visually impaired, but it can also benefit someone with aging or tired eyes.
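On the developer side, supporting that kind of scaling takes very little code. Here's a minimal Swift sketch of adopting Dynamic Type, the text-style API that follows the system's text-size setting:

```swift
import UIKit

// A minimal sketch of honoring the user's preferred text size (Dynamic Type).
// Text styles scale with the system-wide text-size setting, so larger type
// reaches everyone who wants it, disabled or not.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.numberOfLines = 0
label.text = "This text follows the system text-size setting."

// Re-apply the preferred font whenever the user changes the setting.
let observer = NotificationCenter.default.addObserver(
    forName: UIContentSizeCategory.didChangeNotification,
    object: nil,
    queue: .main
) { _ in
    label.font = UIFont.preferredFont(forTextStyle: .body)
}
// Keep `observer` around for as long as the label is on screen.
```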
In a similar vein, accessibility isn't solely about discrete Accessibility features. While a big part of my writing involves reporting on iOS' (and watchOS') Accessibility features and how they affect users, I do make an effort to focus on, and write about, the smaller aspects of accessibility. That is to say, I try to find accessibility in less obvious places – for instance, how technologies like Touch ID and Force Touch impact the disabled.
This concept has extended to my testing of the iOS 9 public beta throughout the summer. As I've gotten used to the new operating system on my iPhone 6 and iPad Air, I've come to notice several details that weren't designed explicitly for accessibility, but nonetheless make the experience more accessible (and more enjoyable).
With that in mind, here are five "little things" in iOS 9 that stand out the most.
Last summer, I wrote an article for iMore in which I stressed the importance of looking at hardware in accessibility terms. I wrote, in part:
Assessing the kinesthetic, tactile value of using an iPhone or iPad is just as important as assessing the software it runs. Speaking from personal experience, not only am I visually impaired but I also suffer from a mild form of cerebral palsy. What this means is, for me, the strength in my hands and fingers is substantially less than that of a fully-abled person. Hence, it takes much more effort to hold things — in this case, my iOS devices — as well as do things on my devices, like typing. Because of this, my approach to buying a new iPhone or iPad depends not only on 64-bit systems-on-a-chip and improved cameras, but also on how the device feels in my hands: the weight, width, thinness, etc.
What applies to iPhones and iPads also applies to Apple Watch. In the context of the Watch, the hardware that matters most, accessibility-wise, is the bands. To folks like me who suffer from motor delays, the ability to successfully get the Apple Watch on and off is as key to a positive user experience as the quality of the software it runs.
Aside from keeping our iPhones in our pockets more, I think the Apple Watch is compelling for another reason: communication. The way Apple is allowing people to communicate via Apple Watch – with taps, doodles, and, yes, even heartbeats – is a clever, discreet new paradigm that epitomizes the company’s mantra that the Watch is the most intimate and personal device it has ever created. I, for one, am very much looking forward to trying these features.
What’s even more compelling, though, in my view, is the engine powering the delivery of said communication – namely, the Taptic Engine. Beyond notifications and communication on the Watch, Apple has brought the Taptic Engine to one other form: the trackpads of the new MacBook and the refreshed 13-inch Retina MacBook Pro. I had an opportunity to play with the Force Touch trackpad for about 30 minutes at my favorite Apple Store here in San Francisco, and came away very, very impressed.
I find Apple’s embrace of haptic feedback fascinating and exciting, because the use of haptic technology has some very real benefits in terms of accessibility.
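To illustrate how little code a haptic cue takes, here is a minimal watchOS sketch in Swift. The saveDocument function is a hypothetical stand-in for an app-specific operation; the point is that success or failure can be felt, not just seen or heard.

```swift
import WatchKit

// A minimal watchOS sketch: distinct haptic patterns can stand in for visual
// or audible cues, which is part of why haptics matter for accessibility.
func saveDocument() -> Bool {
    // Hypothetical app-specific work; always "succeeds" in this sketch.
    return true
}

func confirmSave() {
    let succeeded = saveDocument()
    // Play a tap the wearer can feel without looking at or hearing the device.
    WKInterfaceDevice.current().play(succeeded ? .success : .failure)
}
```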
In the overview I wrote for MacStories of the new Accessibility features of iOS 8, I said this about the operating system’s support for third-party keyboards:
This topic (as well as QuickType) is worthy of its own standalone article, but the accessibility ramifications of iOS 8’s third party keyboard API are potentially huge for those with special needs.
Four months later, that statement still holds true.
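For context, a third-party keyboard on iOS is an app extension built around UIInputViewController. A stripped-down sketch of one, with a single hypothetical typing key plus the required key for switching keyboards, might look like this:

```swift
import UIKit

// A stripped-down sketch of an iOS custom keyboard extension: one hypothetical
// "Hello" key that types into the host app, plus the required "Next" key that
// switches to the next keyboard. Real keyboards lay out a full key grid.
class KeyboardViewController: UIInputViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let typeKey = UIButton(type: .system)
        typeKey.setTitle("Hello", for: .normal)
        typeKey.addTarget(self, action: #selector(typeHello), for: .touchUpInside)

        let nextKey = UIButton(type: .system)
        nextKey.setTitle("Next", for: .normal)
        nextKey.addTarget(self, action: #selector(advanceToNextInputMode), for: .touchUpInside)

        let row = UIStackView(arrangedSubviews: [typeKey, nextKey])
        row.distribution = .fillEqually
        row.frame = view.bounds
        row.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(row)
    }

    @objc private func typeHello() {
        // All text reaches the host app through the text document proxy.
        textDocumentProxy.insertText("Hello")
    }
}
```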
Over the past few weeks, I’ve been using two such keyboards — Fleksy and Keedogo — in an effort not only to test-drive a new feature, but also to gauge the keyboards’ accessibility merits. While I’ve found Fleksy and Keedogo to be fine in terms of accessibility, even in the midst of testing them I found myself going back to Apple’s stock keyboard. In the end, the fact that I don’t use any third-party keyboard as a replacement for the default one on my iOS devices is no fault of either developer — it’s Apple’s.
Before explaining why it’s Apple’s fault, though, it’s important to discuss the virtues of Fleksy and Keedogo.
Soon after WWDC ended in June, I wrote a piece for MacStories in which I briefly summarized all the new features Apple added to Accessibility in iOS 8. Since then, I’ve spent a lot of time with the iOS 8 beta builds and have found several features and little touches – even some things that aren’t specific to Accessibility – that merit a more detailed look. What follows are my personal impressions of those discoveries.