With iPhone X and the TrueDepth camera, Apple is introducing two very different systems: Face ID, which handles biometric authentication, and face tracking for ARKit, which lets augmented reality apps mimic your facial expressions. The two are, internally, completely separate. But since the TrueDepth camera powers both, there's been some confusion and concern over how Apple is handling biometric face data and what, if any, access to it developers might have. Let's clear that up.
What is Face ID and how does it work?
Face ID is similar to Touch ID. Both are biometric identity systems that let you more quickly and conveniently unlock your iPhone and authenticate transactions. Where Touch ID uses your fingerprint as captured by the sensor in the Home button, Face ID uses your face data as captured by the TrueDepth camera on iPhone X.
From my Face ID explainer:
Once you've registered [your face] with Face ID, and you go to unlock, here's what happens:
- Attention detection makes sure your eyes are open and you're actively and deliberately looking at your device. This is to help avoid unintentional unlock. (It can be disabled for accessibility if desired.)
- The flood illuminator makes sure there's enough infrared light to "see" your face, even in the dark.
- The dot projector creates a contrasting matrix of over 30,000 points.
- To counter both digital and physical spoofing attacks, a device-specific pattern is also projected.
- The TrueDepth camera reads the data and captures a randomized sequence of 2D images and depth maps, which are then digitally signed and sent to the Secure Enclave for comparison. (Randomized to, again, counter spoofing attacks.)
- The portion of the Neural Engine inside the Secure Enclave converts the captured data into math and the secure Face ID neural networks compare it with the math from the registered face.
- If the math matches, a "yes" token is released and you're on your way. If it doesn't, you need to try again, fall back to passcode, or stay locked out of the device.
For developers, it works like Touch ID: an app can ask the system to authenticate you, but all it ever gets back is a pass or fail. What developers can get isn't face data but face tracking — through ARKit.
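As a rough sketch, the developer-facing side of Face ID is the same LocalAuthentication code path used for Touch ID — the prompt string and the actions taken on success are placeholders, but the API calls are the real framework:

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Face ID and Touch ID share the same biometrics policy; all capture
// and matching happens inside the Secure Enclave, out of the app's reach.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, evalError in
        // The app only ever receives a Bool — no face images, no depth data.
        if success {
            // Proceed with the protected action.
        } else {
            // Fall back to a passcode prompt or show an error.
        }
    }
}
```

Note that the callback delivers a simple yes/no plus an optional error; there is no API that returns anything about the face itself.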
What is face tracking in ARKit and how does it work?
ARKit is Apple's framework for augmented reality. It handles everything from plane detection to lighting and scaling. Developers have already gotten ARKit apps to do things like lipstick and makeup previewing, but with the TrueDepth camera on iPhone X, much more specific support is possible.
Here's how it works:
- The app asks permission to access the camera (if you're using it for the first time).
- The TrueDepth camera creates a coarse 3D mesh matching the size, shape, topology, position, and orientation of your face, along with your current facial expression.
- ARKit provides that information to the app.
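The steps above can be sketched in ARKit. This is a minimal, assumption-laden example (the view controller and how you render the mesh are up to the app), but the configuration and anchor types are the actual framework API:

```swift
import ARKit

final class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        // Running the session triggers the standard camera permission
        // prompt the first time the app uses it.
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARFaceAnchor exposes only a coarse geometry mesh plus the
            // face's position and orientation — nothing from the Secure
            // Enclave, and nothing Face ID matches against.
            _ = faceAnchor.geometry   // ARFaceGeometry: vertices and topology
            _ = faceAnchor.transform  // position and orientation in 3D space
        }
    }
}
```

Everything the app sees flows through that `ARFaceAnchor` — a generic mesh that updates as you move, not an identity.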
At no point does the app (or developers) communicate at all with the Secure Enclave or get any of the Face ID biometric data stored therein.
In other words, the app knows there's a face and what it's doing but it has no idea whose face it is and gets none of the precise details Face ID matches against.
What ARKit gets that Face ID doesn't is anchor points in 3D space. So, apps can attach funny eyebrows and keep them attached as you move around. That's it.
Just like an app can tell where, when, and how you're touching the display, but can't identify your fingerprints, ARKit can tell how you're looking at the TrueDepth camera, but only so far as to map your movements and expressions to a poop emoji.
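That expression mapping comes through ARKit's blend shapes: named coefficients from 0.0 to 1.0 describing how far each facial feature has moved. A hypothetical helper (the thresholds and labels are mine, the `blendShapes` keys are real) shows how little an app actually learns:

```swift
import ARKit

// Returns a coarse expression label from an ARFaceAnchor's blend shapes.
// The app learns "this face is smiling" — never whose face it is.
func describeExpression(of anchor: ARFaceAnchor) -> String {
    let smileLeft  = anchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
    let smileRight = anchor.blendShapes[.mouthSmileRight]?.floatValue ?? 0
    let browRaise  = anchor.blendShapes[.browInnerUp]?.floatValue ?? 0

    if (smileLeft + smileRight) / 2 > 0.5 { return "smiling" }
    if browRaise > 0.5 { return "surprised" }
    return "neutral"
}
```

It's exactly this kind of coefficient that drives an animated character's mouth and eyebrows — movement data, not biometric data.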
Making Face Matching privacy even more granular
One thing I would like to see in future versions of iOS is separate privacy settings for face matching. Asking for camera access is fine for an app that only wants the camera for ARKit face tracking, since you can grant or revoke it at any time to precisely control the tracking.
For apps that might want camera access for more than just ARKit face tracking, though, it's an all-or-nothing equation. Either you get all the features or none. You can't pick or choose just the ones you're comfortable with.
A discrete setting for face tracking would be both more transparent and more flexible for everyone.
Any face-based questions?
New technology is always confusing and it's good to be cautious. Some people still tape over the selfie cams on their phones and laptops as a matter of course. In the end, it's up to each individual to learn as much as possible and then make the best decision between security and convenience for them.
I'm rather paranoid by nature but, based on everything I've seen to date, I'm confident there's no way for developers or anyone else to get at my biometric face data with Face ID or ARKit, just like they haven't been able to get at my biometric fingerprint data with Touch ID or multitouch.
But the more tests and the more questions, the better. So keep 'em coming!
*Originally published September 27, 2017. Updated November 30, 2017, with a proposal for separate ARKit privacy settings.*
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.