Apple details new AR experiences with Location Anchors in ARKit 4

ARKit (Image credit: Apple)

What you need to know

  • Apple has detailed some of the changes coming to ARKit 4.
  • It shows how developers can use the new framework to create new AR experiences for iPhone and iPad users.
  • One feature includes AR experiences anchored to a location.

Apple has detailed its changes to ARKit 4 in a new developer news release, including a new Location Anchors feature.

Apple announced ARKit 4 as part of its developer framework updates at WWDC 2020 this week. From the release:

ARKit 4 on iPadOS introduces a brand-new Depth API, creating a new way to access the detailed depth information gathered by the LiDAR Scanner on iPad Pro. Location Anchoring leverages the higher resolution data in Apple Maps to place AR experiences at a specific point in the world in your iPhone and iPad apps. And support for face tracking extends to all devices with the Apple Neural Engine and a front-facing camera, so even more users can experience the joy of AR in photos and videos.

The Depth API lets apps access fast, detailed depth information from the LiDAR Scanner, enabling new features like taking measurements.
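To make that concrete, here is a minimal sketch of how an app might opt in to the Depth API and read per-frame depth. The class and setup are illustrative, not from Apple's release; the `frameSemantics` and `sceneDepth` APIs are the ARKit 4 additions.

```swift
import UIKit
import ARKit

// Sketch: enable scene depth on a LiDAR device and read it per frame.
class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    func startTracking() {
        let configuration = ARWorldTrackingConfiguration()
        // sceneDepth is only available on LiDAR-equipped devices (iPad Pro).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in meters.
        guard let depth = frame.sceneDepth else { return }
        let depthMap = depth.depthMap
        _ = depthMap // feed into measurement or occlusion logic here
    }
}
```

Checking `supportsFrameSemantics(_:)` first keeps the same code path safe on devices without a LiDAR Scanner.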

The coolest new feature is arguably Location Anchors:

Place AR experiences at specific places, such as throughout cities and alongside famous landmarks. Location Anchoring allows you to anchor your AR creations at specific latitude, longitude, and altitude coordinates. Users can move around virtual objects and see them from different perspectives, exactly as real objects are seen through a camera lens.
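In code, that anchoring boils down to a geo-tracking session plus an `ARGeoAnchor` at fixed coordinates. The sketch below uses hypothetical coordinates; the availability check matters because geo tracking only works in supported regions and on supported devices.

```swift
import ARKit
import CoreLocation

// Sketch: anchor AR content at a specific latitude/longitude/altitude.
func placeGeoAnchor(in session: ARSession) {
    // Geo tracking is region- and device-limited; check before running.
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }
        session.run(ARGeoTrackingConfiguration())

        // Illustrative coordinates, not a real deployment.
        let coordinate = CLLocationCoordinate2D(latitude: 37.7955,
                                                longitude: -122.3937)
        let anchor = ARGeoAnchor(coordinate: coordinate, altitude: 16.0)
        session.add(anchor: anchor)
    }
}
```

Content attached to the anchor then stays fixed at that point in the world as the user moves around it.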

Location Anchoring relies on a localization map from Apple Maps, with localization performed on the device, so no location data goes to Apple. This could give users access to AR experiences tied to their geographical location; for example, developers could attach AR labels to buildings to help tourists with navigation or sightseeing.

The final detailed feature is Expanded Face Tracking Support, which now extends to any device with a front-facing camera and the A12 Bionic chip or later, including the iPhone SE. Developers can track up to three faces at once with the TrueDepth camera. You can read the full release, including a list of all of ARKit 4's new features, here.
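A face-tracking session with the expanded support can be sketched as follows; `isSupported` is what gates the feature to A12-or-later devices, and `supportedNumberOfTrackedFaces` reports the per-device limit (up to three with TrueDepth).

```swift
import ARKit

// Sketch: start face tracking on any supported device and request
// as many simultaneous faces as the hardware allows.
func startFaceTracking(with session: ARSession) {
    // isSupported is false on devices without the Apple Neural Engine.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // Clamp to the device's limit (up to 3 with the TrueDepth camera).
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```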

Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design.

Before becoming a writer Stephen studied Ancient History at University and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news, as well as featuring fun trivia about all things Apple.