While there are many changes coming to the next iPad Pro models from Apple, you only need to look at one for a moment to spot one of the more obvious differences. The cameras on this new tablet series are different, and not just from the previous generation. These new iPads pack tech you won't find in the latest iPhone either, which is why the camera cut-out looks like nothing else you've seen from Apple before.
A big part of this change is the inclusion of a new sensor, called a LiDAR Scanner. It sits alongside the camera array to help make a handful of things more useful. Need to know more? Keep reading!
What is LiDAR?
You might have heard of SONAR and RADAR, so it might seem like LiDAR is just another one of those detection systems. While it's true that systems like self-driving cars use RADAR and LiDAR together, these technologies function very differently from one another. LiDAR uses pulses of laser light, typically in the near-infrared spectrum, to measure distance.
When a LiDAR Scanner is active, it sends out rapid pulses of invisible laser light. The sensor measures how long each pulse takes to bounce off an object and return, and by repeating this thousands of times per second it builds a "picture" of the environment in front of the scanner.
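The measurement itself is simple time-of-flight math: the pulse travels out and back, so the distance to the object is half the round-trip time multiplied by the speed of light. A minimal sketch of that calculation (the function name and the sample timing value are illustrative, not taken from any real LiDAR API):

```python
# Time-of-flight distance estimate: a light pulse travels to an object
# and back, so the one-way distance is (speed of light * round trip) / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second, in a vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in metres for a measured round trip."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after ~33 nanoseconds bounced off something
# roughly 5 metres away -- about the scale a tablet-sized scanner works at.
print(round(distance_from_round_trip(33e-9), 2))
```

The nanosecond timescales involved are why this takes dedicated sensor hardware rather than an ordinary camera.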
In larger deployments, like cars and airplanes, LiDAR systems can offer a live snapshot of objects at multiple depths, giving a computer a perception of depth similar to what human eyes provide. This makes it possible for the computer to "see" far enough ahead to be safe while the vehicle is in motion, which is great. But its efficacy is determined by power output and size, so the smaller the sensor, the shorter its effective measurement range.
Why do I want LiDAR in my iPad Pro?
Like USB-C in the previous generation and stereo speakers in 2015, Apple's iPad Pro line is a testing ground for new tech that's useful for professionals. LiDAR is incredibly useful for developers who want to build for the future. In particular, Apple's push into Augmented Reality over the last couple of years has needed specialist hardware to increase accuracy.
Developers want to be able to quickly map a table and build an environment over it, but as you can see in the AR Plus mode in Pokemon Go, building an AR environment on the fly still takes a moment and some action on the part of the user to set everything up. Based on what Apple has promised with more accurate measurement tools in its iPad Pro demo app, it should be possible to simply open an app and have more accurate depth tools working instantly. No more waving your device around to detect a surface!
The bottom line? If you want to live on the bleeding edge of AR development, this new feature is going to be a very cool thing to have while developers explore what is possible over the next couple of months. If you're not as excited by this, you're probably fine waiting until this tech comes to your iPhone.
Or, who knows, maybe a slick set of glasses?