Apple is hailing the forthcoming iPhone X as "the future" of the smartphone. And futuristic it is: It has an edge-to-edge display, wireless charging, facial recognition, augmented reality, and is driven almost completely by gestures. As with so many of the products to come out of Cupertino, to own an iPhone X is to own a piece of the future today.
I was in the audience at the glistening Steve Jobs Theater as Phil Schiller blitzed through the equally impressive iPhone 8, only to return to the stage moments later to discuss the iPhone X. As its features were explained and demonstrated, one thought kept recurring: how accessible is this glimpse of the future going to be? The iPhone X is, in some ways, a radical departure from the iPhone we've come to know and love for a decade now, and accessibility is surely going to be a question.
Apple says the iPhone X won't ship until November 3, and I had only about a minute with the device in the hands-on area following the presentation. Until then, we can only speculate about how the iPhone X will perform. Still, it's worth considering the usability of the new phone, particularly from an accessibility point of view.
That big, beautiful OLED display
According to Apple, the iPhone X has a 5.8-inch "Super Retina Display" with a resolution of 2436x1125 at 458 pixels per inch. By contrast, the iPhone 7 Plus I've been using for the last year has a "Retina HD Display" of 1920x1080 at 401 pixels per inch. (The iPhone 8 Plus shares these specs.) My 7 Plus still has a beautiful display, but the iPhone X's OLED screen makes a noticeable difference in quality. Pixel density is one thing, but the new iPhone's screen is markedly better in every way. It's bigger, brighter, sharper, and more vivid. In my brief time with it, I was struck by how nice the X's screen is.
I've written before about how Retina displays make the smartphone experience more accessible. The gist of it is the brighter and sharper a screen is—be it on an iPhone, iPad, or Mac—the easier it is on my eyes because I don't strain as hard to see. Less strain means less fatigue (and pain), which ultimately translates to a better experience as a visually impaired user. Even in my brief moment with it, I immediately could tell the difference the iPhone X's OLED screen makes. Its characteristics, coupled with the edge-to-edge design, make it the best display I've seen yet. I'm excited to get more time with the iPhone X, because that screen is a stunner.
Another nicety about the iPhone X's screen is True Tone. Introduced with the 9.7-inch iPad Pro in 2016, True Tone is making its iPhone debut on the X. I've had True Tone on my 10.5-inch iPad Pro, and it's wonderful. I notice the screen adapting to different lighting conditions, getting warmer and cooler where appropriate. In terms of accessibility, True Tone is another of Apple's screen technologies that make the viewing experience better. The effect isn't as dramatic as the advent of Retina in 2010, but it's an added layer that makes content feel nicer. Anything that helps my vision is a win in my book, so I'm glad to have True Tone coming to the iPhone.
Facing the Future with Face ID
In the wake of the event, I've gotten a deluge of questions from blind and low vision people on Twitter about Face ID. Many of them have expressed trepidation over how accessible Apple's new facial recognition system will be since Face ID requires you look at the screen, and that can be difficult or even impossible for many who have little or no sight.
Allow me to allay everyone's fears. I spoke with people at Apple about Face ID after the presentation ended and was assured Face ID—like everything else Apple makes—was built with accessibility in mind. You'd expect it to be, of course.
Specifically, Apple told me there are three parts to this.
First, Face ID is fully integrated with VoiceOver. If someone relies on the screen reader, it will guide them through the face-scanning process, with cues on when and how to move their face during setup. If you've used the iOS Camera app with VoiceOver turned on—which identifies things like face detection and objects in a shot—then you'll instantly feel at home. Face ID and VoiceOver work similarly.
Second, there's an accessibility option on the setup screen to force Face ID to perform the depth-mapping using only a single shot. Tap the button and instead of using multiple shots at various angles, your face will be scanned using a single image. This is useful if you're someone with limited or no movement in your neck; you can still benefit from Face ID by enabling this option. It's great that you can do so from right within setup rather than going into Settings first.
Lastly, there is an option under Accessibility that, if turned on, tells Face ID not to look for attention. This is useful insofar as many blind and visually impaired users cannot look directly at the screen to trigger Face ID. There is one caveat to the feature, however. In an interview with TechCrunch's Matthew Panzarino on how Face ID works, Apple's Senior Vice President of Software Engineering Craig Federighi said there is a "compromise" in not using attention detection. Panzarino notes users who opt to forgo detection still can use Face ID, but the trade-off is "a lower level of overall security" because their eyes aren't looking at the screen. (Face ID needs to be able to see your eyes, nose, and mouth to scan.)
"You can turn off attention detection as a user," Federighi told TechCrunch. "There's some compromise to detection there—but if you have a condition where you can't look at it, that's the choice you have."
Home Is Just a Swipe Away
Apple has jettisoned the Home button in the iPhone X—an element of the iPhone that has become iconic to its branding—in favor of a software solution. Instead of pressing a button, you now use a gesture. Swiping up from the bottom in any app will cause the app to "fall back" into its icon on the Springboard.
For accessibility, this shift isn't totally unprecedented. For years, the AssistiveTouch feature has had a virtual Home button for users who can't physically press a tactile button. While this isn't directly analogous to the swipe-to-go-Home gesture on iPhone X, the assumption is similar: the Home button doesn't exist, whether practically or literally.
Also consider Switch Control. Switch Control's entire reason for being is to help people who can't physically touch their device(s) navigate them. I have yet to confirm this with Apple, but I imagine Switch Control on iPhone X supports the Home button gesture. (The same should be true for accessing Notification Center and Control Center from the top corners.)
For myself, as someone who's become accustomed to swiping up to invoke the iPad's Dock on iOS 11, I don't foresee the iPhone X's swipe-to-go-Home gesture being an issue.
Charging Without Wires
AirPower, Apple's wireless charging mat that's due sometime next year, was a smaller but not insignificant announcement. From an accessibility perspective, I'm excited for it because it'll rid me of plugging in my three most-used products (iPhone, Apple Watch, and AirPods) in order to charge them.
This goes back to what I wrote last year prior to the iPhone 7 coming to market. The story for AirPower is the same: Not having to fiddle with a cable to charge saves me precious visual and fine-motor energy. Losing the headphone jack meant I gained AirPods, which I am utterly smitten with. They have completely changed the way I listen to music and podcasts; the case is genius. Likewise, the arrival of AirPower will enable me to simply lay my iPhone (or whatever) down on the mat and wait for the chime. (The sound you hear when you plug in your phone is a great audible cue that it's plugged in and charging.) And when I'm ready to leave home, I can pick up my phone and go. No more having to detach the Lightning cable.
You might not think of AirPower as an assistive tool, but it totally can be. If you, like me, have physical motor delays, even something as mundane as inserting and removing your power cord can prove frustrating at times. Thus, the ability to just set my devices down on the mat makes charging more accessible.