Despite other vendors starting to ship in-screen fingerprint scanners, Touch ID isn't coming back. What smaller-batch vendors can do a year later isn't a great indication of what Apple could have done a year ago, at a scale of hundreds of millions of users whose expectations were set by the speed and reliability of Touch ID 2.
As for this year, Apple burned the Touch ID boat behind them to make sure everyone on every team had no choice, and no fallback, but to make Face ID work. But even Face ID isn't the answer. It's one of many possible answers. It's simply the one Apple could ship at sufficient scale right now.
Just like Touch ID was faster and more convenient than a passcode, and Touch ID 2 was so fast it barely felt like authentication, Face ID is almost transparent. Most of the time, your phone unlocks or your app authenticates and you're left staring at the fading animation, just beginning to realize you've been identified, when you're already in.
There are a few times, though, when your finger moisture has changed or you're wearing gloves, or your face is at an odd angle or you're all bundled up, that "it just works" just stops working. It's not often and it's not a lot, but it's enough to shatter the illusion. It's enough to make you want something even better and more transparent than Touch ID or Face ID.
Something passive... and persistent.
The future of authentication
Imagine a future iPhone where authentication doesn't require a specific fingerprint scan, facial geometry scan, or biometric challenge/response. Instead, it would continuously grab snippets of biometric and other data, and use that data to maintain a state of "trust" in which your iPhone simply stays unlocked for as long as it can be reasonably (or strictly, depending on settings) certain it's in your possession, challenging you only when that state becomes uncertain.
Other vendors are already incorporating Touch ID-like sensors into capacitive displays rather than into a discrete capacitive home button. There are also patents for microLED technology that would further enhance the screen's ability to read fingerprints. In the future, certain areas of the iPhone display, or even the entire display, could pull at least partial fingerprint data every time you touched it.
Face ID is already doing full-face geometry scans with neural engine processing to unlock iPhone X. It seems almost trivial that the TrueDepth camera could grab at least partial facial geometry every time you looked at the screen.
Siri began doing the basics of Voice ID a couple of years ago. Now, when you use setup buddy on a new device, it has you say a few simple phrases so it can distinguish your voice, and your voice queries and commands, from those of others. I don't believe it's robust enough for authentication yet, but companies like Nuance have been offering just those kinds of "my voice is my passport, authorize me" services for a while. It's not tough to see Apple using the multiple, beam-forming mics on iPhones and AirPods to constantly check for your voice either.
Apple's A-series processors also contain M-series sensor fusion hubs. Right now those are used for things like health and fitness apps and games. Taken further, though, gait analysis could record and check your walking and motion patterns, so that as you move around, your iPhone can know it's you doing the moving.
Biometric data could also be supplemented by other factors, like trusted objects. Previously, trusted objects were dumb: grab someone's dongle and you got into their phone. With Apple Watch, though, trusted objects got smarter. Auto Unlock on macOS, which uses the proximity of your Apple Watch to authenticate you for your Mac, feels downright magical. You authenticate on the Watch via its passcode or via Touch ID on your iPhone, and that authentication is then projected from Watch to Mac.
So could environmental data. For example, if you're in a certain place at a certain time that fits your existing patterns, that could add to the trust weighting.
Taken separately, each of these authentication methods either requires user action or doesn't provide enough security to be useful. Taken together, though, every touch of the display provides a partial print, every glance at the camera a partial face or iris scan, every word a partial voice print, and every step a partial gait analysis. If a paired Apple Watch is nearby and you're in a place, at a time, that fits your pattern, enough factors pass authentication that, the moment your iPhone senses any engagement, it's already unlocked and ready to be of service.
Conversely, any time enough factors fail authentication, your phone goes into lockdown and challenges for a proper fingerprint, iris scan, or passcode/password to make sure you're really you. And it could escalate for situations that warrant it. That's what happens today, for example, after a reboot, a timeout, or a software update. For secure enterprise or government use, it could do so more often and require multiple factors to resume a trusted state.
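The trust model described above can be sketched as a weighted score: each passive factor contributes partial evidence, the device stays unlocked while the combined score clears a threshold, and falling scores trigger escalating challenges. This is purely a hypothetical sketch; the factor names, weights, and thresholds are my own illustrative assumptions, not anything Apple has published.

```python
# Hypothetical passive-authentication factors and weights. The names
# and numbers here are illustrative assumptions, not Apple's model.
FACTOR_WEIGHTS = {
    "partial_fingerprint": 0.30,
    "partial_face_scan": 0.30,
    "voice_match": 0.15,
    "gait_match": 0.10,
    "trusted_watch_nearby": 0.10,
    "location_time_pattern": 0.05,
}

UNLOCK_THRESHOLD = 0.6    # stay unlocked while the score is at least this
LOCKDOWN_THRESHOLD = 0.3  # below this, demand a passcode/password

def trust_score(observations):
    """Combine per-factor confidences (0.0 to 1.0) into a weighted score."""
    return sum(FACTOR_WEIGHTS[name] * confidence
               for name, confidence in observations.items()
               if name in FACTOR_WEIGHTS)

def next_state(score):
    """Map a trust score to a device state, escalating as trust decays."""
    if score >= UNLOCK_THRESHOLD:
        return "unlocked"
    if score >= LOCKDOWN_THRESHOLD:
        return "challenge"   # ask for one strong factor: face or finger
    return "lockdown"        # require passcode, as after a reboot today

# Everyday case: a glance, a touch, and a nearby Watch keep the phone open.
score = trust_score({
    "partial_face_scan": 0.9,
    "partial_fingerprint": 0.8,
    "trusted_watch_nearby": 1.0,
})
print(next_state(score))  # -> unlocked
```

A stricter enterprise profile, as described above, would just raise the thresholds or require several strong factors before returning to the "unlocked" state.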
Not if, when
We'll need considerable advances in battery chemistry, and strict adherence to privacy policies, to enable this kind of technology, but Apple is uniquely positioned to deliver both. Just as with its chipsets, Apple can invest in the technology without having to act like a battery vendor, and unlike data-harvesting companies, it doesn't want or need any of the personal information this surfaces.
To me, arguing about whether Touch ID or Face ID is better, or whether Touch ID is coming back, misses the point. Touch ID isn't there for Touch ID's sake. Face ID isn't there for Face ID's sake. Both are solutions to the same problem and, in the future, there will be still better, faster, and easier ways to solve that problem, or ways to make it disappear entirely so it no longer needs solving.
Historically, that seems like the approach Apple takes. And that's why I think it's not about whether we see passive, persistent authentication — but when.