iPhone 12 Pro LiDAR scanner (Source: Daniel Bader / Android Central)

With iOS 14, Apple introduced a bevy of new accessibility features, many of which are useful to a wider audience as well; Back Tap, for example. With iOS 14.2, alongside the iPhone 12 Pro and the upcoming iPhone 12 Pro Max, Apple is introducing another amazing accessibility technology that makes fantastic use of the LiDAR scanner. It's called People Detection, and it is astounding. If you are an iPhone user with a sight impairment, this LiDAR scanner technology alone is reason to choose the iPhone 12 Pro over the iPhone 12.

Not only is this an amazing feature for the sight-impaired, but it could also help everyone be aware of whether they are keeping a proper social distance. With four different types of feedback to tell you how far away a person is, you can keep your distance even if someone is walking straight toward you.


What is People Detection?

People Detection helps sight-impaired iPhone users judge proximity. Because of the pandemic and the need for social distancing, proximity matters more than ever. When sight-impaired people navigate public spaces, it can be difficult to determine just how far away someone else is, and whether that person is walking toward you or veering away from you.

People Detection uses People Occlusion in ARKit together with the LiDAR scanner on the iPhone 12 Pro and iPhone 12 Pro Max, as well as the iPad Pro. The LiDAR scanner measures the distance from the device to objects within 15 feet, in real time. The system combines this depth data with People Occlusion in ARKit to determine whether any people are in the field of view, then gives continuous feedback about how far away the nearest person is.

On-device machine learning determines whether an object in the camera's view is a person. If it is, the LiDAR scanner measures how far away that person is (the nearest person in the field of view). Whether you're standing still or walking forward, you receive feedback about the distance between you and that person. It works both statically and dynamically, even if you are walking toward a person who is walking toward you, away from you, or at an angle to your left or right.
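The flow these two paragraphs describe, classify objects, keep only the people, and report the nearest one's distance, can be sketched in Python. The function name and the (label, distance) data shape are illustrative assumptions, not Apple's actual on-device API:

```python
def nearest_person_distance(detections):
    """Given (label, distance_in_feet) pairs from an object classifier
    and a depth sensor, return the distance to the nearest person,
    or None if no person is in view.

    Illustrative sketch only; Apple's on-device pipeline is not public.
    """
    people = [dist for label, dist in detections if label == "person"]
    return min(people) if people else None

# A chair at 1 foot is ignored; the nearer of the two people wins.
print(nearest_person_distance([("chair", 1.0), ("person", 3.2), ("person", 1.8)]))  # 1.8
```

The same selection logic explains why, in a crowd, feedback follows one person at a time rather than everyone at once.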

How does People Detection work in the real world?

There are four ways to get feedback about the distance of the nearest person in the camera's view.

Audio readout - A voice announces how far away the person is, in either feet or meters.

Setting a threshold distance - You can set a threshold distance (6 feet, for example) and hear two different audio tones, depending on whether the person is inside or outside that threshold.

Haptics - Once a person is detected in the camera's view, the phone emits a slow, low haptic pulse. The closer the person gets, the faster the pulse.

Visual readout - An on-screen readout shows how far away the nearest person is and where that person is relative to your current direction.

You can use each of them individually or in any combination, and they all work with VoiceOver.
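A rough sketch of how these feedback modes might map distance to output, in Python. The readout wording, tone names, and pulse-timing formula are assumptions for illustration; Apple has not published the exact mapping:

```python
def feedback(distance_ft, threshold_ft=6.0):
    """Return (readout, tone, pulse_interval_s) for a detected person.

    All specific values here are illustrative assumptions, not Apple's.
    """
    readout = f"{distance_ft:.1f} feet"  # spoken and on-screen readout
    # Two-tone threshold cue: one tone inside the threshold, another outside.
    tone = "near" if distance_ft <= threshold_ft else "far"
    # Haptic pulse speeds up as the person gets closer (shorter interval),
    # floored so the pulse never becomes impossibly fast.
    pulse_interval = max(0.1, distance_ft / 10.0)
    return readout, tone, pulse_interval

print(feedback(3.0))  # ('3.0 feet', 'near', 0.3)
```

The key design idea the sketch captures is that all four modes derive from a single number, the distance to the nearest person, so they can be layered freely.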

If there are multiple people in the camera's view, People Detection tracks the nearest person and switches to another only once that person leaves the camera's view.

How can People Detection help with social distancing?

As with many accessibility features, People Detection can be useful to a broader audience.

When you set a threshold distance of 6 feet, you create a sort of geofence around you: the phone alerts you whenever someone comes within 6 feet. If you're at all concerned about how close your chair is to the couple dining next to you, you can check the distance with People Detection.

What other features does People Detection have?

If you're using AirPods Pro, you can take advantage of Spatial Audio to hear in either the left or the right ear when someone is off to the left or off to the right of you.

It works indoors and outdoors, but not in pitch-black or extremely low-light conditions, because the machine learning that determines whether an object in view is a person relies on the camera image alongside the LiDAR data.

People Detection works with ... people. But Apple is opening up this technology to developers through ARKit and Core ML, so anyone could create similar detection apps, such as car detection.

When is People Detection available?

It ships with iOS 14.2, which is currently in beta. If you're already testing the iOS 14.2 release candidate, which just launched today, you can find it in the Accessibility settings of the Magnifier app.