How Do ToF and LiDAR Work Together for AR?

Nov 18, 2021

Depth maps and point clouds are cool, and, for some people and applications, they’re enough. For most AR applications, though, that raw depth data has to be put into context. Both ToF and LiDAR systems do this by working together with the other sensors on the mobile device: specifically, the platform needs to understand your phone’s orientation and movement, so it knows where each depth measurement was taken from.
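To make that concrete, here’s a minimal sketch of what this pairing looks like on a LiDAR-equipped iPhone with ARKit (the ToF/ARCore path on Android is conceptually similar): every frame delivers a depth map alongside the motion-tracked camera pose. The class name here is made up for illustration; only the ARKit calls are real.

```swift
import ARKit

// Sketch: reading depth and device pose together on a LiDAR iPhone with ARKit.
final class DepthSessionController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene depth is only offered on devices with a LiDAR scanner.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The per-pixel depth map from the LiDAR scanner...
        let depthMap = frame.sceneDepth?.depthMap
        // ...and the camera pose estimated from motion tracking, delivered in the same frame.
        let devicePose = frame.camera.transform
        _ = (depthMap, devicePose)
    }
}
```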

Building a map of the environment while keeping track of the device’s location within that map is called Simultaneous Localization and Mapping, or “SLAM.” SLAM is used in other fields, like autonomous vehicles, but it’s essential for mobile AR applications to place digital objects in the physical environment.
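In practice, the “localization” half of SLAM hands the app a camera pose on every frame, and that pose is what lets a depth sample measured relative to the phone be pinned to a fixed spot in the room. Here’s a minimal sketch of that coordinate transform, assuming an ARKit-style camera-to-world matrix (the function is mine, not a platform API):

```swift
import simd

// Sketch: turn a point measured in the camera's coordinate frame (e.g. a depth sample)
// into world coordinates, using the camera pose that SLAM estimated for this frame.
func worldPoint(fromCameraPoint p: SIMD3<Float>,
                cameraTransform: simd_float4x4) -> SIMD3<Float> {
    let homogeneous = SIMD4<Float>(p.x, p.y, p.z, 1)
    let world = cameraTransform * homogeneous
    return SIMD3<Float>(world.x, world.y, world.z)
}
```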

This is particularly true for experiences that stay in place when the user isn’t interacting with them, and for occlusion, where digital objects appear to sit behind physical people and objects.
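On LiDAR devices, this kind of occlusion can often be switched on rather than hand-rolled. A hedged sketch, assuming a RealityKit ARView and an ARKit world-tracking session:

```swift
import ARKit
import RealityKit

// Sketch: let reconstructed real-world geometry hide virtual content (LiDAR devices only).
let arView = ARView(frame: .zero)
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
// People standing in front of virtual content can also occlude it, via depth-based segmentation.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
arView.session.run(config)
```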

Another important factor in placing digital objects in both LiDAR- and ToF-based applications is the use of “anchors.” Anchors are digital points in the physical world to which digital objects are “attached.”
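One common pattern, sketched here against ARKit’s raycasting API (the helper function name is made up), is to raycast from a screen tap onto the surfaces the device has mapped and pin an anchor at the hit point:

```swift
import ARKit

// Sketch: raycast from a screen tap onto mapped surfaces and "attach" content there.
func placeAnchor(at screenPoint: CGPoint, in view: ARSCNView) {
    guard let query = view.raycastQuery(from: screenPoint,
                                        allowing: .estimatedPlane,
                                        alignment: .any),
          let hit = view.session.raycast(query).first else { return }

    // The anchor holds a fixed world transform; the renderer keeps attached content at that spot.
    let anchor = ARAnchor(name: "placedObject", transform: hit.worldTransform)
    view.session.add(anchor: anchor)
}
```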

In world-scale applications like Pokemon Go, this is done through a separate process called “geotagging.” In experiences mapped by the device itself, however, the digital object is anchored to points in a LiDAR point cloud or to feature points on a depth map.
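For world-scale, geotagged placement, some platforms now expose this directly. For example, ARKit’s geo anchors tie content to a latitude and longitude rather than to a locally mapped point; the coordinates below are placeholders, and geo tracking only works in supported regions:

```swift
import ARKit
import CoreLocation

// Sketch: a world-scale, geotagged anchor. Requires ARKit geo tracking, which is
// only available in supported regions; the coordinates here are placeholder values.
let session = ARSession()
let geoConfig = ARGeoTrackingConfiguration()
session.run(geoConfig)

let coordinate = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
let geoAnchor = ARGeoAnchor(coordinate: coordinate)
session.add(anchor: geoAnchor)
```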
