Apple Watch, WatchKit, and Accessibility

Ever since rumors started swirling that Apple was working on a wearable device, I've often thought about what such a device would mean for people with disabilities. My curiosity is so high, in fact, that I've even written about the possibilities. Make no mistake: for users with disabilities such as myself, a wearable like the Apple Watch brings usage and design paradigms whose impact, I think, could be even greater than that of the iPhone in one's pocket.

Suffice it to say, I'm very excited for Apple Watch's debut sometime next year.

Apple's release of WatchKit to developers has heightened my excitement and curiosity about the Watch, because the tools offer a clearer glimpse into what the device is capable of doing. Specifically, I'm very keen to find out what, if any, accessibility software is included. We know Siri is integrated into the Watch, but what of VoiceOver or Dynamic Type? Surely, with a screen so small, Apple has to build in a certain level of accessibility for visually impaired users. Furthermore, how will Apple accommodate users with fine-motor challenges, who may have trouble manipulating the Digital Crown? Edge cases like these should be taken into consideration as well.

I spent some time looking through the Apple Watch SDK, but I found no references to accessibility. After asking around on Twitter, I was alerted to code in WKInterfaceObject that allows for accessibility labels and localized text. I'm no computer programmer, so I'm unable to fully understand what that means, but my low-level take is that, yes, Apple has baked some accessibility features into the Watch's operating system. What those features actually are remains to be seen, but it's comforting nonetheless to know that the Watch will be accessible, more or less.
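
For the curious, here's roughly what that hook looks like in practice. WKInterfaceObject does expose setters such as setAccessibilityLabel, which VoiceOver can read aloud; everything else below (the controller, the outlet name, the strings) is a hypothetical sketch, not code from Apple:

```swift
import WatchKit
import Foundation

class HeartRateController: WKInterfaceController {
    // Hypothetical outlet; real interface objects are wired up in the storyboard.
    @IBOutlet weak var heartRateLabel: WKInterfaceLabel!

    override func awakeWithContext(context: AnyObject?) {
        super.awakeWithContext(context)
        heartRateLabel.setText("72 BPM")
        // VoiceOver speaks this label instead of the terse on-screen text,
        // and NSLocalizedString keeps the spoken string localizable.
        heartRateLabel.setAccessibilityLabel(
            NSLocalizedString("Heart rate: seventy-two beats per minute",
                              comment: "Spoken by VoiceOver"))
    }
}
```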

Even with WatchKit, there's still much that we don't know about the Watch's functionality --- or, for that matter, the S1 computer module that powers it all. Still, given what is known at this point about the Watch and how it works, I think it's safe to make some educated guesses on how the Watch's hardware and software will impact accessibility. To that end, I believe there are two facets of the Watch that have the potential to be huge usability wins for the disabled.

First is raise-to-wake, as demonstrated in the introductory video. As Jony Ive describes, Apple Watch senses when the user is raising his or her wrist and automatically turns on the display. This is great for someone like me who, with visual and motor impairments, might have trouble finding and pressing a button to wake the screen. As someone who wore digital watches in the past, I can attest that locating and pressing a physical button to turn on the screen is a real bother. (What's worse, sometimes physical buttons have multiple functions.) That all I need to do with Apple Watch is raise my wrist to bring up the display makes it that much more accessible, because it removes the friction (finding and pressing a button) from the otherwise mundane task of checking your watch.

Second, there's the "Taptic Engine" that Apple uses to deliver messages to the wearer via a subtle tap-on-the-wrist sensation. That subtle tapping, I think, will be huge for the visually impaired, insofar as it allows for another mode of notification. Constantly checking your watch can be problematic: not only is it socially inappropriate, but repeated glances at such a small screen may cause increased eye strain. It's a small detail, to be sure, but I know from personal experience how much better I feel by conserving my visual energy, so to speak.
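
A note for developers: the initial WatchKit release doesn't appear to expose the Taptic Engine directly. Instead, notifications originate in the companion iPhone app, and the Watch relays them to the wrist with a tap. Here's a minimal sketch, assuming an iOS 8-era companion app that has already registered for user notifications (the reminder text and timing are hypothetical):

```swift
import UIKit

// Hypothetical reminder scheduled by the companion iPhone app. When the
// paired Watch is on the wrist, it can relay this notification with a tap.
func scheduleStandReminder() {
    let notification = UILocalNotification()
    notification.fireDate = NSDate(timeIntervalSinceNow: 30 * 60) // 30 minutes out
    notification.alertBody = NSLocalizedString("Time to stand up and stretch.",
                                               comment: "Reminder shown on the Watch")
    notification.soundName = UILocalNotificationDefaultSoundName
    UIApplication.sharedApplication().scheduleLocalNotification(notification)
}
```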

Moreover, for hearing-impaired users, the Taptic Engine could be beneficial in that it would replace auditory feedback, much like the adaptive technology used by the deaf. Growing up with deaf parents, I saw this firsthand: our doorbell had not only a ring but also an attached lightbulb, so that whenever the doorbell rang, it flashed too. The same applied to our telephone, and it even extended to the alarm clock in my parents' bedroom: when the alarm went off every morning, not only would there be an audible buzz, but the bed would vibrate. That's how my parents knew it was time to get up.

As I said at the outset, I'm very excited for the Apple Watch. Although it's too early to truly tell how its software and typeface will impact accessibility (the thing isn't even out yet!), I'm cautiously optimistic that the Watch will be a substantial improvement over adaptive aids like the classic talking watch, in functionality and in fashion. It'll be very interesting to see over the coming months what more Apple reveals regarding the Watch, especially when it comes to accessibility.

Steven Aquino

Steven is a freelance tech writer who specializes in iOS Accessibility. He also writes at Steven's Blog and co-hosts the @accessibleshow podcast. Lover of sports.

6 Comments
  • Hi. Slightly off-topic, but as a developer it would be interesting to hear what features in an app really make a difference to visually impaired users. Which things make an app unnecessarily difficult to use, and which make it easier, especially if this is relatively simple to implement? (For instance, some apps may have better speech information than others, or a better way of helping the user navigate through an app.) I hate to think there may be something simple I could do that would increase accessibility if only I knew about it, and I haven't seen much coverage of this in the blogs I usually read, so I thought it would be a good topic for an article on iMore. I have tried using VoiceOver myself, but it really takes some learning, and the experience only increased my respect for those who have to use it all the time. Peter
    Soluble Apps
  • These new scrolling ads are horrific. Please find another way to advertise. Simply not acceptable.
  • I am a beginner developer, and I see the Taptic Engine as a chance to really push the boundaries of apps for all. I am often heard discussing my UI/UX from the view of "well, what if we cover the screen of the device: what would I include now?" The point I am trying to portray (perhaps awkwardly) is that I could inform the user that there is something approaching, or that we are approaching something, through a sequence of haptic feedback "taps" or aural sounds. When I have started this conversation, the feedback is often: "I didn't think of that." When we come back to the information on a screen, I often talk about the visual cues, not the information that we are used to: does a runner need to see the detail of his heart rate, or could they get away with just seeing a colour ("green" = within range, "amber" = under/over, etc.)? This article is great because it is a topic that I think will get bigger as app developers explore more ways of making use of our other senses in their apps. I guess in some ways, as the device gets smaller (iPhone 6 Plus to Apple Watch), accessibility becomes more prominent.
  • Somehow I believe that operating the watch with a crown is a bad design decision. I believe that the swipe gestures of Android Wear are the far better concept.
  • The Watch still utilizes swipe very much. (See the videos on apple.com.) The crown replaces pinch-to-zoom.
  • What about Morse code to read out texts?