
Apple has just shown the world how its Smart Glasses will likely work

Focals North Glasses Lory Side Fake Photoshop (Image credit: Rene Ritchie / iMore)

It's been clear for some time now that Apple's plans for Augmented Reality go far beyond looking through your phone as though it were a window into another world. Apple Glasses have been rumored and speculated about for years, but ask the people making those predictions how the glasses would actually work while still being something people would want to wear on their faces, and the answers are typically rather uninspiring.

With the launch of the latest iPad Pro, the biggest missing puzzle piece has fallen into place, and the path to smart glasses with the Apple logo on the side seems much closer to being a real thing than it did a week ago.

Putting privacy first is necessary, but it makes AR complicated


Georgia Dow wearing Google Glass (Image credit: iMore)

Augmented Reality is, by necessity, what happens when you take a snapshot of the real world and overlay virtual objects on top of it. And right now, if you want to do this well, you need a combination of technologies we're currently fine with having on our phones. Accelerometers and gyroscopes position you within the augmentations and tell the environment how you're moving around inside it. Spatial audio and directional speakers help you feel like you're really interacting with a virtual environment, and cameras help establish depth, which creates the illusion needed to make things feel real for you as the user.


That last one, the camera? Yeah, we've already seen how well that works out when you have a camera on your face. Google Glass raised so many privacy concerns the company shifted the project out of the consumer space entirely, and lesser implementations like Snapchat Spectacles have never been seen as more than a gimmick. Future iterations of smart glasses, like Focals by North, ditched the camera entirely and opted to exist as more of a Heads-Up Display instead of proper AR glasses. Apple clearly isn't going to go that route, with all of the resources it has put into helping developers create compelling AR experiences on phones and tablets so far.

If Apple is actually making AR glasses, there can't be a camera onboard. Developers need to be able to create these compelling AR experiences on your face without the camera in order to guarantee a level of privacy that other glasses haven't been able to deliver on to date. And the answer to how this is accomplished lives in its newest iPad Pro.

LiDAR Scanner doesn't just replace cameras, it improves on the experience


Camera-based augmented reality is seen as a necessity in phones and tablets because you need to be able to see the real world while you play with the virtual one. The overlay is necessary while your phone is the window, because your phone isn't transparent.

With glasses, you don't actually need a camera if you have something like Apple's LiDAR Scanner tech. The depth map LiDAR creates is not only more accurate than what you get from even high-end cameras like the ones in the iPhone 11 Pro, it's also far faster. Check out this video from Apple's dev partners and see how quickly that AR experience starts up.

It's nearly instant! Compare that to starting up AR mode in Pokémon Go, which not only takes a moment to load but requires you to wave your phone in a circle so the software can choose a fixed point to build a depth map. LiDAR also uses less power than your camera does in AR mode, so it would be easier to deploy on something you're supposed to wear on your face, where the space for a battery is limited.
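To get a sense of why startup is so fast, here's a minimal sketch of how an app taps the LiDAR Scanner through ARKit on a supported device like the new iPad Pro (the scene-reconstruction API shipped with ARKit 3.5; this is an illustrative fragment, not Apple's glasses software):

```swift
import ARKit

// Configure world tracking as usual.
let configuration = ARWorldTrackingConfiguration()

// On LiDAR-equipped hardware, ARKit can build a live mesh of the
// surroundings from the depth map instead of waiting for camera-based
// plane detection to converge — hence the near-instant startup.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// Running the session starts delivering the reconstructed mesh to the app.
let session = ARSession()
session.run(configuration)
```

On devices without the scanner, `supportsSceneReconstruction(.mesh)` returns false and the app falls back to the slower, camera-driven plane detection the article describes.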

Apple Glasses soon? Probably not.

This seems like a huge step forward, but it's not perfect yet. Don't expect Apple to "One More Thing" its glasses this fall or anything. The current implementation of LiDAR on the new iPad Pro is clever, but limited. The depth map only works out to about five meters, which is plenty when you're sitting at a desk or riding in a car, but if you're out walking in the world you'd want considerably more reach into the virtual world as a baseline.

It's possible Apple could offset this with other tech from your phone, like GPS coordinates or other positional data. But overall, AR tech on your face needs to at least be able to see the ground in front of you when you're standing, and that's not currently a guarantee for everyone.

But it finally feels like we're getting closer to this being a real thing instead of a rumor, and that's very exciting. And if you want to be in on the next wave of AR from Apple, the new iPad Pro is going to be where all the action is for a while.

Russell is a Contributing Editor at iMore. He's a passionate futurist whose trusty iPad mini is never far from reach. You can usually find him chasing the next tech trend, much to the pain of his wallet. Reach out on Twitter!

2 Comments
  • The lidar scanner works out to 5 meters not 5 feet.
  • Seems like it almost has to have a camera. Probably shouldn't be allowed to store pictures. But there are so many AR applications that wouldn't work without a camera (imagine walking through a garden and seeing labels on the flowers, for an easy example).