We've known for some time now that Apple is working on augmented reality glasses. Speculation on what these glasses will be capable of has varied wildly, from fully immersive experiences to little more than simple text on lenses. Prolific Apple leaker Jon Prosser has offered up some concrete details on these glasses, and when you put those details together they paint a fairly clear picture.
That picture may not be the cyberpunk fantasy headset so many seemed to think Apple was working on, but it's going to be impressive all the same.
Privacy first, unless you're in direct sunlight
According to Prosser, Apple Glass is going to look and feel very much like a "normal" set of glasses, with displays in both lenses coming together to form a single image that floats in front of you. Prosser says the version of the headset he saw was very early and made of plastic, but notes the finished product will likely be metal like an iPhone or Apple Watch. Charging will be done wirelessly through a custom dock: you set the glasses upside-down in the cradle to charge. It's unclear whether this will be an Apple Watch-like proprietary charger or something that works with multiple chargers, but the latter seems more likely.
You won't be able to see what someone is doing with Glass just by looking at them from the other side.
You won't find a camera on this headset, a detail that has been rumored for a while now. Apple's dedication to privacy, along with all of the concerns over the cameras baked into Google Glass and Snapchat Spectacles, makes this pretty easy to believe. And if Apple were to make the camera anywhere near as good as the ones on its phones, it would take up far more space than a set of glasses like this can offer. Instead, Apple is going to rely on LiDAR, which we'll touch on later.
One major detail we have on the display Apple has chosen for these glasses is that you won't be able to see what someone is doing with Glass just by looking at them from the other side. This reminds me of Focals by North, which similarly never really show when they're in use. I'm a little concerned about how well the display will work outdoors, though. Prosser noted Apple couldn't get its lenses to tint without having problems, which means there will likely be a sunglass attachment like we've seen with Focals by North. And like Focals, this could mean the display is easily washed out on a sunny day.
Notifications, navigation, and probably not much else for you to do
What will you actually be able to do with Apple Glass? Processing will not happen on the device, which means it will be first and foremost a heads-up display for your phone. Notifications will appear in front of you, and you'll be able to interact with them using simple gestures or "Hey Siri," like you can on your Apple Watch.
Will the experience be just like the Apple Watch? That seems unlikely. Apple's come a long way since the original Apple Watch, which also did most of its processing on the iPhone, and it's learned quite a bit. Turn-by-turn navigation, either walking or driving, seems more than a little likely.
Starboard, the name of the UI for Apple Glass, is going to be Apple's biggest push into contextual computing yet. Apple Glass is going to suggest actions to you, rely heavily on voice and gesture instead of forcing you to pick up your phone, and communicate well with the rest of your Apple ecosystem. It's going to be less about what you can do with this new hardware, and more about how much easier Glass can make your day-to-day interactions with the rest of the world.
When Apple's LiDAR sensor made it to the 2020 iPad Pro, lots of folks thought it pointed to depth data on the glasses. Fever dreams of being able to project an augmented reality game onto the coffee table in front of you and see it like a hologram projected by R2-D2 filled the world. But with no local processing and fairly limited displays, these glasses aren't quite ready to deliver such a futuristic experience.
All of this for $499? It seems too good to be true.
Instead, LiDAR is going to deliver better gesture controls for Apple Glass. Instead of a touchpad for you to fondle when you want to drift through the UI, the LiDAR sensor will be able to "see" your hand in front of your glasses and let you make gestures to control the UI. It's not clear whether the level of control will be similar to the bloom gesture in Microsoft's HoloLens or something more like simple hand waving, but the idea is that you wave your hands and make things happen on your Glass. It'll be interesting to see how this works in public spaces.
One final detail from Prosser, which got kind of glossed over in his video, is "special" QR codes from Apple that work with Glass. There's no camera on Glass, so how would this work? If the LiDAR sensor is good enough, the special thing about these QR codes may be that they are 3D. Raised characters in places like retail outlets or sports centers could let Apple partners deliver custom Glass experiences just by having a wearer walk in and look at the wall. That would give Apple Glass users something unique and fairly painless to do, which will be good for everyone.
All of this for $499? It seems too good to be true, but Prosser's leaks rarely turn out to be terribly far off. I'm a fan of this Apple Glass future, especially if it's possible to use these glasses in unique ways without having someone nearby be concerned because you're wearing a computer on your face.
Here's hoping Apple feels comfortable enough with its current release cycle to share a preview with us this year.