It didn't get any stage time during the big Apple keynote at WWDC, but in my opinion one of the coolest things coming to iOS 12 is eye tracking through ARKit 2. Apple is allowing developers to use the TrueDepth camera on the iPhone X to determine where your eyes are looking on the screen.

Within hours of the iOS 12 developer beta being announced, several incredible examples of what this tech is capable of had found their way into demo apps.

The accuracy of this simple test is nothing short of incredible, and because of the way Apple's TrueDepth camera works, this tech will operate just as well in dark rooms as in broad daylight, or better, which can't be said of the eye tracking systems found on Android. There are some incredible implications here for assistive tech as well, enabling people who aren't always able to use their hands to navigate iOS to simply gaze and blink to use an app.

On the surface, this is an all-around win for consumers. In fact, after seeing the demos I found myself scratching my head wondering why Apple didn't make this tech the star of the WWDC keynote. Outside of it currently being limited to the iPhone X, this is in many ways breakthrough technology without equal. But the more I thought about it, the more I realized Apple probably knows this technology is missing an important component for mainstream use: privacy.

Regardless of platform, eye tracking introduces a ton of privacy concerns. A browser with eye tracking enabled introduces a ton of cool features, but it is infinitely more valuable to website owners and advertisers. If an advertiser can prove people are looking at ads on a page, it creates a whole new kind of impression tracking. Or worse, ad rolls that can tell you aren't watching and refuse to be dismissed until they have confirmed you were looking for a certain period of time. Being able to track how long someone looks at an image, or where they are most focused in a video, is all information that has a lot of value to people who aren't the person holding the iPhone. While there's an argument to be made for trusting the people who make your apps, Apple's constant push for a more secure operating environment for everyone all but demands a new permission specifically for eye tracking.

ARKit apps currently have to ask you for permission to access the camera, but eye tracking goes above and beyond what most people think the camera on their phone is capable of. This technology is a door to a whole lot more information than whether you have your tongue out for Memoji, and by design it doesn't even require the camera UI to be open, so you have no real window into what is happening on the computational end of the experience. For most people, eye tracking through ARKit 2 is going to look and feel like magic. Time and again through the evolution of modern tech, that jaw-dropped sensation tends to make questions about how safe something is go away.
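To see why the current permission model feels thin, it helps to look at what a developer actually writes. Gaze data rides along in the same face-tracking session that powers Memoji, behind nothing more than the standard camera prompt. A minimal sketch (using the iOS 12 face-tracking APIs as Apple documents them; not a complete app) might look like:

```swift
import ARKit

// Sketch: reading gaze data from ARKit 2 face tracking on a TrueDepth device.
// The only gate the user sees is the ordinary camera permission prompt;
// nothing indicates that their gaze specifically is being captured.
class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth hardware (iPhone X today).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint (new in iOS 12) estimates where the two eyes converge,
            // expressed in face-anchor space. Projecting it through the camera
            // transform is enough to tell which part of the screen you're watching.
            print("Estimated gaze point:", face.lookAtPoint)
        }
    }
}
```

Every frame delivers a fresh gaze estimate with no further user interaction, which is exactly why a dedicated, separately worded permission makes sense.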

It's still early days for eye tracking on the iPhone. We know the iPhone X is the only phone in the current generation of devices even capable of offering the feature, and we know iOS 12 proper is still quite a ways away from being something everyone has access to. That gives Apple some time to implement a new kind of permission for eye tracking, and I genuinely believe we'll see that happen sooner rather than later. Ideally, it would look similar to the camera permission request, but instead be phrased around the use of your gaze. If I know an app is about to use where I am looking on the screen as information, I am likely to think twice before enabling it. As excited as I am for the future of this particular form of augmented reality, my concern about how the people on the other side of the screen will use that information has me deeply curious about Apple's approach moving forward.
