Camera technology on modern smartphones has evolved significantly over the years, to the point where they can capture images that are on par with, if not better than, those from a standalone camera. In recent years, we have seen a wide variety of tech, from dual-camera setups to 64MP primary shooters to wide-angle lenses. As a result, the smartphone camera has grown to become the preferred choice for most people in day-to-day scenarios. Among these features, one of the hottest trends right now is the ToF (Time of Flight) sensor found on some of the newer smartphones. So, what is a ToF camera, and how does it work? Follow along to learn more.

[Image: Huawei P30 Pro]

What is a Time-of-Flight (ToF) camera?

At a basic level, a Time-of-Flight (ToF) camera is much like any other camera, except that it comes with a depth sensor that allows it to create a three-dimensional image of the surroundings with a well-defined depth of field. You can think of a ToF sensor as SONAR (Sound Navigation and Ranging), with the difference being the use of IR light, instead of sound, to create a three-dimensional representation of what appears in its field of view. Using this representation, the camera can detect objects and measure distances more precisely, and in turn, capture much sharper photos.

A ToF camera employs time-of-flight range imaging techniques to determine the distance between the camera and the subject for each point of the image. For the uninitiated, ToF is the measurement of the time taken by a wave or particle to traverse a specified distance through a medium; this measurement can then be used to learn more about the properties of the object or medium.
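Since light travels at a known, constant speed, a round-trip timing directly yields a distance: half the product of the speed of light and the measured time. Here is a minimal Python sketch of that conversion (the 10-nanosecond figure is just an illustrative value, not from any real sensor):

```python
# Convert a single time-of-flight measurement into a distance.
# The IR pulse travels to the subject and back, so the one-way
# distance is half of (speed of light x round-trip time).

SPEED_OF_LIGHT = 299_792_458  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance in metres for a measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A subject roughly 1.5 m away returns the pulse in about 10 ns.
print(f"{tof_distance(10e-9):.3f} m")  # ~1.499 m
```

This also hints at why ToF sensors need extremely precise timing: a 1-nanosecond error in the round-trip measurement corresponds to roughly 15 cm of depth error.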

[Image: Samsung Galaxy S10 rear camera]

Depending on the manufacturer, the name used for the ToF camera on a smartphone can be anything from a range sensor to a 3D sensor to a depth vision sensor. Take Samsung, for example; the company markets its implementation on the newest S10 lineup under the name DepthVision lens. Naming aside, the camera uses the same underlying technology to capture detailed images with well-defined bokeh.

Besides their use in smartphones, ToF sensors also find applications in object identification, gesture recognition, autonomous driving, Augmented Reality (AR)/Virtual Reality (VR), gaming consoles, and much more.

How does a Time-of-Flight (ToF) camera work?

[Image: ToF camera]

Time-of-flight (ToF) technology isn't new. It has been around for a couple of decades now, undergoing rigorous experimentation to become as efficient and cost-effective as it is today. A conventional ToF camera uses LIDAR (Light Detection and Ranging), a surveying method that measures the distance to a target by illuminating it with light and capturing the reflection with a sensor. Essentially, it measures the difference in return times of the reflected light to determine how far away the object is.

With the ToF camera on modern smartphones, the same principle is applied to capture depth information for each individual pixel and extract a three-dimensional depth map, which helps produce an image with a well-defined depth of field and a cleanly separated foreground and background. This, in turn, allows the camera to produce a somewhat precise portrait or bokeh photo. 'Somewhat', because even though the sensor does most of the work in creating the depth map, there is still a lot that needs to be done on the software side to achieve the perfect portrait or bokeh shot that everyone admires.
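The foreground/background separation a depth map enables can be sketched with a toy example. Everything below is invented for illustration: the 4x4 "image", the depths, the 2 m threshold, and the crude mean-based stand-in for blur (real camera software uses far more sophisticated matting and lens-blur simulation):

```python
import numpy as np

# Toy grayscale "image" and a matching depth map in metres.
image = np.arange(16, dtype=float).reshape(4, 4)
depth = np.full((4, 4), 3.0)   # background at ~3 m
depth[1:3, 1:3] = 1.2          # subject at ~1.2 m

# Pixels closer than the threshold count as foreground (the subject).
foreground = depth < 2.0

# Crude stand-in for background blur: keep foreground pixels sharp and
# flatten background pixels to their mean value.
bokeh = np.where(foreground, image, image[~foreground].mean())
```

The key idea carries over to the real thing: the depth map turns "which pixels belong to the subject?" into a simple per-pixel comparison, rather than a guess from a single 2D image.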

So every time you pull out your smartphone to capture a photo, the camera shoots a pulse of IR light into its field of view, which reflects off objects located at varying distances from the camera. As a result, light bouncing off nearby objects reaches the sensor before light that strikes faraway objects. The smartphone measures the time each ray of light takes to make the round trip and, after performing some calculations, creates a detailed three-dimensional depth map. The camera software then fuses this depth map with the conventional image to get the perfect bokeh or portrait shot.
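Applied across every photosite at once, that timing step becomes the depth map itself. A small numpy sketch, with invented round-trip times for a hypothetical 4x4 sensor grid:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458  # metres per second

# Hypothetical round-trip times (seconds), one per pixel, as the sensor
# would report after timing the returning IR pulse at each photosite.
round_trip = np.full((4, 4), 20e-9)  # background, ~3 m away
round_trip[1:3, 1:3] = 8e-9          # subject, ~1.2 m away

# Each pixel's one-way distance: (speed of light x round-trip time) / 2.
depth_map = SPEED_OF_LIGHT * round_trip / 2

# The inner pixels come out near 1.2 m, the outer ones near 3 m.
print(depth_map.round(2))
```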

One thing to note about the entire process is that, despite the processing power and resources that go into crunching those calculations and stitching the depth map onto the image, the time required is negligible. A large part of this comes down to the ToF camera itself, which carries out these operations efficiently and independently, without demanding excessive processing power from the smartphone.

As a result, we are starting to see more smartphone manufacturers adopting ToF cameras on both the front and the rear. Just as a smartphone uses a rear ToF camera to capture portrait shots, a front-facing one can be used to capture portrait selfies. Not to mention, the face unlock mechanism, which uses the front camera to authenticate users, can also benefit from a ToF camera, since it can generate a more detailed face map to enhance security.

What are the applications of a Time-of-Flight (ToF) camera, and what does its future hold?

[Image: Xbox ToF camera]

Apart from smartphones, ToF cameras also find their place in fields like AR/VR, biometric authentication, gesture-based navigation, and autonomous vehicles. Beyond capturing photos, manufacturers are starting to use the ToF sensor for biometric authentication: the phone first creates a detailed map of the user's face and then matches against it whenever the user tries to authenticate on the device. Similarly, the Kinect for Microsoft's Xbox uses a ToF sensor for gesture detection and facial authentication, and some autonomous cars use LIDAR (Light Detection and Ranging) to see and track nearby objects and navigate accordingly. However, even though the cost of manufacturing a ToF camera has come down considerably over the years, it is still not cost-effective for self-driving cars, which demand multiple cameras to see their surroundings and navigate effectively.

Which smartphones have a Time-of-Flight (ToF) camera?

[Image: Huawei P30 Pro]

Time-of-Flight (ToF) cameras have started to appear on smartphones only recently, thanks to the tech becoming more readily available and cost-efficient. Some of the smartphones that come with a ToF sensor include the Samsung Galaxy S10 5G, LG G8 ThinQ, Huawei P30 Pro, Honor View 20, and Sony Xperia XZ4, to name a few. This year's iPhone 11 models are also rumored to come with a TrueDepth system powered by a ToF camera, which would be used to improve the AR experience, achieve more accurate 3D rendering, get better portrait shots, and improve Face ID.

With the current trend in the smartphone world leaning heavily towards photography, we can expect more manufacturers to offer ToF cameras on their upcoming smartphones, and over time, to see the feature make its way into the mainstream, even on lower-end devices.
