
This article was published on October 14, 2020

Wait, what’s a LiDAR sensor and why’s it on the iPhone 12 Pro?

If you’ve read about autonomous cars, you might know that LiDAR (Light Detection and Ranging) sensors play a huge part in the self-driving side of things. The job of a LiDAR sensor is to ‘map’ the area around it by bouncing light pulses off objects and measuring how long the reflections take to return.
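To get a feel for the principle, here’s a minimal sketch (in Swift, not Apple’s actual implementation) of how a pulse’s round-trip time converts into a distance:

```swift
import Foundation

// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// Distance to an object, given the round-trip time of a reflected light pulse.
/// The pulse travels out and back, so the one-way distance is half the total path.
func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a pulse that returns after ~33 nanoseconds bounced off something
// roughly 5 meters away.
print(distance(forRoundTripTime: 33e-9)) // ≈ 4.95 meters
```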

Apple introduced the sensor in its high-end iPhone 12 Pro and Pro Max flagships to enhance photography. This isn’t the first time the company has used a LiDAR sensor in its products: in March 2020, the Cupertino-based company included one in its new iPad Pro. However, the primary purpose of the sensor in that device was to aid augmented reality.

With the new iPhones, Apple claims the sensor will help you with three things: photo and video effects, precise placement of AR objects, and object and room scanning. The last two applications are largely aimed at developers and industries like construction that need to map rooms. For the average consumer, the photo and video effects are probably the most interesting part.
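For developers, that access comes through ARKit’s scene-depth APIs. Here’s a hedged sketch of how an app might opt into the LiDAR-backed depth data on supported devices (assuming ARKit on iOS 14 or later; this isn’t Apple’s sample code):

```swift
import ARKit

// Opt into per-pixel scene depth, which ARKit only offers on devices
// that have a LiDAR scanner.
func startLiDARSession(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    sceneView.session.run(configuration)
}

// Each ARFrame then carries a sceneDepth map (distances in meters) that apps
// can use for occlusion, precise object placement, or room scanning.
```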

Because LiDAR can work in the dark by shooting lasers to calculate distances, Apple’s using it to improve autofocus on the high-end iPhones. The company claims that, thanks to the sensor, the iPhone 12 Pro models have six times faster autofocus in low light compared to a regular camera. Plus, the LiDAR sensor lets the device take better portrait mode photos in low-light conditions.

LiDAR scanner on the iPhone 12 Pro models


The concept of using a dedicated depth sensor is not new. Plenty of mid-range phones, such as the OnePlus Nord, use a depth sensor camera to enhance portrait mode photos, but these sensors work better in daylight than in low light. Some devices, such as the Huawei P30 Pro and the Samsung Galaxy S20 Ultra, have a time-of-flight (ToF) sensor that uses infrared rays to map the surroundings. You can read about the ToF sensor in our explainer here.

While both ToF and LiDAR sensors in phones can only scan an area a few meters deep, some variants of the latter can measure distances of more than 100 meters. However, those are used primarily on top of cars. The advantage of a LiDAR sensor is that it sweeps smaller pulses across the scene, point by point, to ‘scan’ the area. A ToF sensor, on the other hand, sends a single flash pulse to measure the whole scene, which can introduce more inaccuracies when distances are calculated.

Having a LiDAR sensor on a phone will probably encourage more developers to find use cases for AR that are more convincing than one-time amusement demos. Snapchat has already confirmed that it’ll ship LiDAR-powered lenses in its iOS app.

Apple might not be able to convince you to buy its high-end models on the basis of the LiDAR sensor alone. However, the company hopes that if you’re an AR developer or someone who cares about quality low-light photos, it could be one of the reasons to spend a few more dollars.
