The iPhone 12 Pro is packing a LiDAR sensor – but what is it and how does it work?
The iPhone 12’s camera specs have finally been unveiled – and we now know that the iPhone 12 Pro range is going to be using the new LiDAR scanner on the back. That’s right, the same mysterious dot that first appeared on the iPad Pro 2020.

But what is a LiDAR scanner? A built-in lie detector? A more relaxed version of radar, perhaps? As we’ll discover, LiDAR (or ‘Light Detection and Ranging’) does work in a similar way to radar, only it uses lasers to judge distances and depth. This is big news for augmented reality (AR) and, to a lesser extent, photography too.

The more interesting question, though, is what LiDAR will let us do on the iPhone 12 Pro. Drawing on our experience of the tech on the iPad Pro 2020, we can explore the kind of experiences LiDAR could open up on the new iPhones – and, ultimately, the Apple Glasses. But first, a quick rewind to the tech’s origins, so you can sound smart during your next family Zoom meeting…

The concept behind LiDAR has been around since the 1960s. In short, the tech lets you scan and map your environment by firing out laser beams, then timing how quickly they return. A bit like how bats ‘see’ with sound waves, only with lasers – which makes it even cooler than Batman’s Batarang.

Like most futuristic tech, it started life as a military tool on planes, before becoming better known as the system that allowed the Apollo 15 mission to map the surface of the moon. More recently, LiDAR (also known as ‘lidar’) has been seen on self-driving cars, where it helps detect objects like cyclists and pedestrians. You might have also unwittingly come across the tech in your robot vacuum.

But it’s in the past couple of years that LiDAR’s possibilities have really opened up. With the systems getting smaller, cheaper and more accurate, they’ve started to become viable additions to mobile devices that already have things like powerful processors and GPS – tablets and phones. Of course, not all LiDAR systems are created equal.
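The ‘fire a laser, time the return’ idea above boils down to one piece of arithmetic: a pulse travels to a surface and back, so halving the round-trip time and multiplying by the speed of light gives the distance. Here’s a minimal Python sketch of that calculation – the 33.3-nanosecond pulse time is an invented example value, not a real sensor reading:

```python
# The core time-of-flight idea behind LiDAR: a laser pulse travels
# out to a surface and back, so distance = (speed of light * time) / 2.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance in metres for a laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * seconds / 2

# A pulse that returns after roughly 33.3 nanoseconds has hit
# something about 5 metres away.
print(round(distance_from_round_trip(33.3e-9), 2))  # → 4.99
```

Notice how tiny the timescales are: light covers a metre in about 3.3 nanoseconds, which is why these sensors need such fast, precise electronics.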
Until fairly recently, the most common types built 3D maps of their environments by physically sweeping around in a similar way to a radar dish. This obviously won’t cut it on mobile devices, so newer LiDAR systems are solid-state affairs with no moving parts. You might already be familiar with one example: the 3D time-of-flight (ToF) sensors seen on many Android phones, which help them sense scene depth and mimic the bokeh effects of larger cameras. But what’s the difference between a time-of-flight sensor and the LiDAR ‘scanner’ we’ll most likely see on the iPhone 12 Pro?
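To see how per-pixel depth helps with fake bokeh, here’s a deliberately simplified Python sketch: a tiny depth map where pixels beyond a cut-off distance are flagged as background to be blurred. The 3×3 grid and the 1.5-metre threshold are invented for illustration – real phones use far denser depth maps and much smarter segmentation:

```python
# Toy depth map (metres): a nearby subject on the left,
# distant background on the right.
depth_m = [
    [0.9, 1.0, 4.2],
    [1.0, 1.1, 4.0],
    [1.0, 3.9, 4.1],
]

SUBJECT_CUTOFF_M = 1.5  # anything farther than this counts as background

# Flag each pixel: True = background (blur it), False = subject (keep sharp).
blur_mask = [[d > SUBJECT_CUTOFF_M for d in row] for row in depth_m]

for row in blur_mask:
    print(row)
```

The sharp subject/background split a depth sensor provides is exactly what software-only portrait modes struggle to estimate from a single 2D image.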