Depth Mapping Tech – Now Integrated into Smartphones

A technology that military forces have used for years in range-sensing systems is gaining momentum in the smartphone market. More and more smartphone manufacturers are building improved 3D-mapping time-of-flight (ToF) cameras into their devices.

A time-of-flight camera consists of a sensor that uses a tiny laser to fire out infrared light. The light bounces off anything or anyone in front of the camera and back into the sensor. The sensor measures how long the light takes to return, and that time translates into distance information that can be used to build a depth map.
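The conversion described above is straightforward: distance is half the measured round-trip time multiplied by the speed of light. A minimal sketch (the function name and the example timing are illustrative, not from any particular sensor):

```python
# Time-of-flight distance: the sensor times an infrared pulse's round trip,
# and the distance is half that time multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# An object roughly 1 m away returns light after about 6.67 nanoseconds:
print(tof_distance(6.67e-9))  # roughly 1 metre
```

The nanosecond scale of these round trips is why the article later stresses how sensitive the sensor's clock has to be.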

Until now, most phones have relied on stereo vision, which uses two cameras to calculate rough depth. That method is not very accurate, and it fails in low light or in the dark.
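For contrast, the stereo-vision estimate works by triangulation: depth follows from the cameras' focal length, the baseline between the two lenses, and how far a feature shifts between the two images. A hedged sketch with illustrative parameter values (not taken from any real phone):

```python
# Stereo-vision depth by triangulation: a feature's pixel disparity between
# the two camera images is inversely proportional to its depth.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres from focal length (pixels), baseline (metres),
    and disparity (pixels)."""
    return focal_px * baseline_m / disparity_px

# e.g. a 1000 px focal length, 1 cm baseline, and 10 px disparity:
print(stereo_depth(1000.0, 0.01, 10.0))  # 1.0 metre
```

The dependence on matching features between two images is why the method degrades in the dark: without visible texture there is nothing to match.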

A better method that also uses infrared is structured-light illumination: a dot pattern is projected onto a scene or face, and the sensor measures the spacing between the dots and the distortion of the pattern to calculate depth. This works well at short range, up to about arm's length, for things like facial recognition, which is why Apple employed it in its TrueDepth camera for Face ID, according to digitaltrends.com.
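Geometrically, each projected dot's lateral shift on the sensor encodes depth by the same triangulation relation, this time between the projector and the camera. A rough sketch, with all names and values illustrative assumptions:

```python
# Structured light: a dot from a known projected pattern appears shifted on
# the sensor by an amount inversely proportional to the surface's depth.

def structured_light_depth(focal_px: float, baseline_m: float, shift_px: float) -> float:
    """Depth in metres from the observed shift (pixels) of a projected dot,
    given the camera focal length (pixels) and projector-camera baseline (metres)."""
    return focal_px * baseline_m / shift_px

# At arm's length the shifts are large and easy to resolve, which is why
# the method suits short-range uses such as face recognition:
print(structured_light_depth(1500.0, 0.02, 60.0))  # 0.5 metres
```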

Unlike these approaches, a time-of-flight camera illuminates the scene with a homogeneous flood of light and examines every individual pixel in the image. The sensor is synchronized with an extremely sensitive clock capable of measuring the tiny delays in the light bouncing back. With depth information assigned to every pixel, the result is a rich depth map.
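Because every pixel gets its own round-trip measurement, the same time-to-distance conversion applies across the whole frame at once. A small NumPy sketch of that per-pixel mapping (the timing values are made up for illustration):

```python
# Per-pixel ToF depth map: each pixel's round-trip time is converted to a
# distance, so a frame of times becomes a frame of depths.

import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_map(round_trip_times_s: np.ndarray) -> np.ndarray:
    """Convert an array of per-pixel round-trip times into depths (metres)."""
    return SPEED_OF_LIGHT * round_trip_times_s / 2.0

# A toy 2x2 "frame" of round-trip times in seconds:
times = np.array([[6.67e-9, 13.34e-9],
                  [6.67e-9, 20.00e-9]])
print(depth_map(times))  # roughly [[1, 2], [1, 3]] metres
```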

There are various potential applications for accurate depth mapping like this, which is why it is creeping into more phones.

The underlying technology isn’t new. In the gaming sphere, there was a time-of-flight camera in Microsoft’s Kinect sensor and the military has been using time-of-flight technology to get depth information for many years. But as Dr. Xavier Lafosse, commercial technology director of Corning Precision Glass Solutions, told digitaltrends.com, improvements in the technology have allowed integration of the required elements into ever smaller form factors, and new applications for it are driving its adoption in phones.

This technology is also vital for augmented reality or mixed reality wearables — like Microsoft’s HoloLens or Magic Leap — to work because these systems need a very accurate picture of your environment.

Time-of-flight cameras could also help in indoor navigation. If there’s a 3D map of your building in the cloud, then the sensor could potentially recognize precisely where you are at any given moment.

Mobile robots can build up a map of their surroundings very quickly, enabling them to avoid obstacles or follow a leading person. As the distance calculation is simple, it requires little computational power. Time-of-flight cameras are also used in assistance and safety functions for advanced automotive applications, such as active pedestrian safety and pre-crash detection.
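The simplicity the article mentions can be seen in a minimal obstacle check: with a depth map in hand, flagging anything closer than a safety distance is a single comparison per pixel. The threshold and map values below are illustrative assumptions:

```python
# Simple obstacle detection on a depth map: mark every pixel whose measured
# depth falls inside a safety distance.

import numpy as np

def obstacle_mask(depth_m: np.ndarray, safety_m: float = 0.5) -> np.ndarray:
    """Boolean mask of pixels closer than the safety distance (metres)."""
    return depth_m < safety_m

# A toy 2x2 depth frame in metres; two pixels are dangerously close:
frame = np.array([[2.0, 0.3],
                  [1.5, 0.4]])
print(obstacle_mask(frame))  # flags the 0.3 m and 0.4 m pixels
```

This is why the computational load stays low: no feature matching or pattern decoding is needed, just arithmetic on values the sensor already provides.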