Ian Mackay has a home full of smart gadgets—from lights to fans to window shades—and runs them all from his iPhone and HomePods. “I use Apple technology for pretty much everything that I can,” he says. But even though he’s an Apple power user par excellence, he’s never had an Apple Watch.
It hasn’t been for lack of interest. In 2008, a bicycle crash left Mackay paralyzed. Other than shrugging his shoulders, he can’t move his body from the neck down. That’s why voice-controlling all that equipment is so critical to his life. But it’s also why a smartwatch didn’t seem to have a practical place in it.
“I’ll tell you, I’ve been jealous of my girlfriend or my mom, just being able to have sleep tracking or biometrics,” he says wistfully. “That stuff would be really valuable to me.”
Before long, Mackay and others in similar situations will have the opportunity to use an Apple Watch in a way tailored to their needs. Apple is working on a feature called Apple Watch Mirroring that will provide access to the watch—apps, health sensors, and all—from a paired iPhone, with the watch interface replicated on the phone. That will allow users to call on familiar iPhone accessibility features to control an Apple Watch, including Voice Control as well as Switch Control, which provides alternative means of input such as external switches and the phone’s front-facing camera.
Apple isn’t the only tech giant that’s doing important work in accessibility. But there’s something particularly Apple-esque in how Apple Watch Mirroring leverages multiple products in the Apple ecosystem to create an experience that wouldn’t be possible if the company weren’t responsible for all its own software and hardware. People who are already comfortable using accessibility features to control an iPhone without touching it will be able to apply the same skills to the Apple Watch.
“As soon as they see this, the opportunities are going to click, because they don’t have to do any additional work to learn Voice Control or Switch Control on Apple Watch,” says Sarah Herrlinger, Apple’s senior director of accessibility policy and initiatives. “We’re leveraging powerful connectivity [so] it just works with what they’ve got already.”
Apple Watch Mirroring is one of several accessibility enhancements that Apple is previewing to mark Global Accessibility Awareness Day, which takes place on Thursday, May 19. Live Captions, for example, will auto-transcribe audio on the fly on iPhones, iPads, and Macs, making apps such as FaceTime more useful for people who can’t hear the audio. (It’s similar to an existing Google feature.) And Door Detection, an upcoming addition to the iPhone’s Magnifier app, will use the phone’s camera to assist blind and low-vision users as they approach doors, announcing how distant an entrance is, reading signage such as operating hours, and even explaining whether opening a door involves turning a knob, pulling a handle, or some other action. Apple says that these and other accessibility updates will be available later in 2022.
37 years of accessibility
For Apple, investing in making its products useful to people with disabilities is nothing new: The company created its first team dedicated to accessibility in 1985, the year after the original Mac debuted. But the modern era of accessible Apple design dawned in 2009, when the iPhone got features such as VoiceOver, a talking interface optimized for users who can’t read text on a screen. A lot has happened since then, especially as Apple has entered more and more product categories.
From the start, the Apple Watch offered VoiceOver and a zoom mode that helped make display elements more legible. In 2016, it got fitness tracking designed for people who happen to be using a wheelchair rather than walking or jogging. And last year, Apple introduced AssistiveTouch, which allows for one-handed input. Using the watch’s accelerometer, gyroscope, and optical heart rate sensor, the technology can detect gestures—such as pinching and clenching—that someone makes using the hand on the same arm that the watch is on. (Among the new accessibility features coming later this year are AssistiveTouch-based “Quick Actions” such as making a phone call, playing media, or taking a photo with a double pinch.)
According to Herrlinger, the idea that became Apple Watch Mirroring originated with a user whose cerebral palsy presented challenges when it came to wearing an Apple Watch. “As somebody who is a user of Switch Control, he has been incredibly active using all of our iOS devices—iPad, Macs, and Apple TV,” she says. “But this was the one place that he still couldn’t take advantage of all the Apple ecosystem provided.”
As with many accessibility features, Apple found that it already had some of the capabilities it needed. AirPlay, the wireless tech that powers features such as the ability to stream video from an iPhone to an Apple TV, will provide the basis for Apple Watch Mirroring’s watch-to-phone connectivity.
Real-world testing is critical to getting any accessibility feature right. “There’s a lot of iteration as we go back and forth with individuals who are daily users of this technology to make sure that what we’re building works well for their unique circumstances,” says Herrlinger. Mackay, who appeared in a 2019 Apple commercial, is one such daily user. The company also works with organizations such as the Challenged Athletes Foundation, which introduces it to testers who can provide input on features in development. “We love that we’re in continued conversation with Apple, and we can continue to be a resource for them as they continually evolve their technology,” says CAF chief executive director Kristine Entwistle.
For the people who depend on the accessibility features built into Apple products, the benefits go well beyond getting mundane everyday tasks done. Mackay’s arduous recovery from his 2008 bike crash led to depression—and the technology available to him didn’t improve matters much. “When I was first injured, the only thing I had access to was a BlackBerry with a wired earpiece,” he says. “I think I had two or three commands I could do. It would tell me how much signal I had and how much battery life was left. And I think I was able to answer a call, but I couldn’t make a phone call.”
Then the iPhone and its baked-in accessibility came along. “Once I got access to that phone, I gained some confidence to go out,” Mackay says. “And that is where I found my solace.” In 2016, he rode his power wheelchair on a long-distance adventure across his home state of Washington; that inspired him to found Ian’s Ride, a nonprofit dedicated to helping people with mobility impairments enjoy the outdoors through assistive tech and infrastructure such as wheelchair-friendly trails.
For all of the further transformative potential of something like Apple Watch Mirroring, Apple’s Herrlinger is quick to emphasize that accessibility is never going to be about any one feature, because needs vary so much. “We don’t see it as a checkbox,” she says. “We see it as a continuum where every user’s use of technology is unique.” And ultimately, it isn’t a different continuum than the one the company thinks about when it comes to other aspects of product design. Instead, it’s just a broadening of the philosophy it’s had all along.
“We love what we make,” Herrlinger says. “We want everyone to love what we make.”