A European patent application that shows a possible future iPhone implementation with both facial and fingerprint biometric identity scanners is making the rounds, prompting a lot of goofy bad headlines but also giving us a glimpse into what I really do think is the inevitable future of digital security.
Spoiler: It's not quote-unquote getting Touch ID back, at least not in the conventional sense. Touch ID is the past. Apple burned that boat behind them to make sure everyone on every team had no choice, and no fallback, but to make Face ID work. But Face ID isn't the future either. It's simply the present.
Touch ID was faster and more convenient than a passcode. Face ID doesn't require contact and so feels almost transparent, like no authentication at all.
There are a few times, though, when your finger moisture has changed or you're wearing gloves, or the sun is behind you at just the wrong angle or you're wearing ski gear, where "it just works" just stops working. It's not often, but it's enough to shatter the illusion and make you want something even faster than Touch ID and even more transparent than Face ID.
To make you want the future of persistent, passive biometric authentication.
The future of authentication
Imagine a future iPhone where authentication doesn't require a specific fingerprint scan, facial geometry scan, or biometric challenge/response. Instead, it continuously grabs snippets of biometric and other data, and uses that data to maintain a state of "trust" where your iPhone is simply unlocked for as long as it can be reasonably (or strictly, depending on settings) certain it's in your possession, challenging you only when that state becomes uncertain.
Other vendors are already incorporating Touch ID-like sensors into capacitive displays themselves, rather than into a discrete capacitive home button. There are also patents for microLED technology that would further enhance the screen as a fingerprint reader. In the future, certain areas — or even the entire iPhone display — could pull at least partial fingerprint data every time you touched it.
Face ID is already doing full-face geometry scans with neural engine processing to unlock iPhone X. It seems almost trivial that the TrueDepth camera could grab at least partial facial geometry each and every time you looked at a screen.
Siri began doing the basics of Voice ID a couple years ago. Now, when you use setup buddy on a new device, it has you say a few simple phrases so it can distinguish your voice — and your voice queries and commands — from those of others. I don't believe it's robust enough for authentication yet, but companies like Nuance have been offering just those kinds of "my voice is my passport, authorize me" services for a while. It's not tough to see Apple using the multiple, beam-forming mics on iPhones and AirPods to constantly check for your voice either.
Apple's A-series processors also contain M-series sensor fusion hubs. Right now those are used for things like health and fitness apps and games. Taken further, though, gait analysis could be used to record and check your walking and motion patterns, so as you move around your iPhone can know it's you that's doing the moving.
Biometric data could also be supplemented by other factors, like trusted objects. Previously, trusted objects were dumb — grab someone's dongle and you got into their phone. With Apple Watch, though, trusted objects got smarter. Auto Unlock on macOS, which uses the proximity of your Apple Watch to authenticate you for your Mac, feels downright magical. You authenticate on the watch via passcode or Touch ID on iPhone, then that authentication is further projected from Watch to Mac.
So could environmental data. For example, if you're in a certain place at a certain time that fits your existing patterns, that could add to the trust weighting.
Taken separately, each of these authentication methods either requires user action or doesn't provide enough security to be useful. Taken together though, every touch of the display provides a partial print, every glance at the camera provides a partial face or iris scan, every word a partial voice print, every step a partial gait analysis, and if a paired Apple Watch is proximate and you're in a place, at a time, that fits your pattern, enough factors pass authentication and the moment your iPhone senses any engagement, it's already unlocked and ready to be of service.
Conversely, any time enough factors fail authentication, your phone goes into lockdown and challenges for a proper fingerprint, iris scan, or passcode/password to make sure you're really you. And it could escalate for situations that warrant it. That's what happens today, for example, after a reboot, timeouts, software updates, etc. For secure enterprise or government use, it could do so more often and require multiple factors to resume a trusted state.
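One way to picture the trust system described above is as a weighted score: every passive signal contributes a partial confidence, the weighted total maps to an unlocked, challenge, or lockdown state. This is a minimal illustrative sketch — the factor names, weights, and thresholds are my own assumptions, not anything Apple has published:

```python
# Hypothetical trust-weighting sketch. All names, weights, and
# thresholds here are illustrative assumptions, not Apple's design.
from dataclasses import dataclass

@dataclass
class Factor:
    name: str
    confidence: float  # 0.0 (no match) .. 1.0 (certain match)
    weight: float      # how much this signal counts toward trust

def trust_score(factors):
    """Weighted average of per-factor confidences."""
    total_weight = sum(f.weight for f in factors)
    if total_weight == 0:
        return 0.0
    return sum(f.confidence * f.weight for f in factors) / total_weight

def device_state(factors, unlock_threshold=0.8, lockdown_threshold=0.3):
    """Map the aggregate trust score to a device state."""
    score = trust_score(factors)
    if score >= unlock_threshold:
        return "unlocked"   # stay passively unlocked
    if score <= lockdown_threshold:
        return "lockdown"   # demand full biometric or passcode
    return "challenge"      # ask for one explicit factor

# Example: partial print + partial face + nearby Watch + familiar
# location, but no voice match this moment.
factors = [
    Factor("partial_fingerprint", 0.9, 0.3),
    Factor("partial_face", 0.85, 0.3),
    Factor("watch_proximity", 1.0, 0.2),
    Factor("voice", 0.0, 0.1),
    Factor("location_pattern", 1.0, 0.1),
]

print(device_state(factors))  # prints "unlocked" (score 0.825)
```

The escalation idea maps naturally onto the thresholds: an enterprise or government profile would simply raise `unlock_threshold` and `lockdown_threshold`, forcing explicit multi-factor challenges more often.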
Not if, when
We'll need considerable advances in battery chemistry and strict adherence to privacy policies to enable this kind of technology, but Apple is uniquely positioned to deliver both. Just as with chipsets, they don't have to act like a battery vendor and sell to anyone else, and unlike data-harvesting companies, they don't want or need any of the personal information this surfaces.
To me, arguing about whether Touch ID or Face ID is better, or whether Touch ID is coming back, misses the point. Touch ID isn't there for Touch ID's sake. Face ID isn't there for Face ID's sake. Both are solutions to the same problem and, in the future, there will either be still better, faster, easier ways to solve that problem — or ways to simply make it disappear so it no longer needs solving.
Historically, that seems like the approach Apple takes. And that's why I think it's not about whether we see passive, persistent authentication — but when.