Apple on Wednesday unveiled two new iPad Pro models that come equipped with a LiDAR Scanner, which will offer major improvements to ARKit and photography.
The new 11- and 12.9-inch iPad Pro models are the first of Apple's devices to feature the 3D laser system, but they likely won't be the last. Here's what you need to know about LiDAR, what it brings to the new iPad Pro, and which future Apple devices could adopt it.
At the most basic level, LiDAR is a time-of-flight system that fires low-power laser pulses at an environment and times how long the reflected light takes to return. From those round-trip times, it calculates the distance to objects and points in the scene and builds an accurate 3D depth map or rendering of it.
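As a rough illustration of that principle (a simplified sketch, not Apple's implementation), the distance to a surface falls directly out of the round-trip time of a reflected pulse:

```swift
// Minimal sketch of the time-of-flight principle behind LiDAR.
// Illustrative only; not Apple's implementation.

let speedOfLight = 299_792_458.0 // metres per second

/// Converts the measured round-trip time of a reflected laser pulse
/// into the distance to the reflecting surface.
func distanceFromRoundTrip(seconds: Double) -> Double {
    // The pulse travels out to the object and back, so halve the path length.
    return speedOfLight * seconds / 2.0
}

// A ~33-nanosecond round trip corresponds to roughly 5 metres, which is why
// Apple describes the scanner as operating "at nano-second speeds".
print(distanceFromRoundTrip(seconds: 33.4e-9)) // ≈ 5.0
```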
Apple's own proprietary take on it, simply dubbed the LiDAR Scanner, likely has a few more tricks up its sleeve. Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."
The iPad Pro's LiDAR Scanner is used to create depth-mapping points that, when combined with camera and motion sensor data, produce a "more detailed understanding of a scene," according to Apple.
Both first- and third-party apps will be able to take advantage of more accurate depth mapping.
Among Apple's existing features, LiDAR will have the biggest impact on augmented reality (AR) and Apple's own ARKit framework. Apple says the new LiDAR Scanner will allow for instant object placement, indicating that users wouldn't need to "scan" their environment before an AR app loads.
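For developers, ARKit exposes this through its scene-reconstruction option on LiDAR-equipped devices. A minimal sketch of opting into it (standard ARKit API, with the surrounding app and session setup assumed) might look like this:

```swift
import ARKit

/// Builds a world-tracking configuration that uses the LiDAR Scanner's mesh
/// when the device supports it, so virtual objects can be placed immediately.
func makeLiDARConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        // Scene reconstruction turns LiDAR depth data into a live mesh of the
        // room, removing the usual "wave the device around" scanning step.
        configuration.sceneReconstruction = .mesh
    }

    configuration.planeDetection = [.horizontal, .vertical]
    return configuration
}

// Usage: run it on an existing ARSession, e.g.
// arView.session.run(makeLiDARConfiguration())
```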
Along with improvements to motion capture and people occlusion, the LiDAR Scanner will also make the Measure app much faster and more accurate. Measure also gains a new Ruler View for more granular measurements.
While Apple didn't specifically mention it, LiDAR should improve photography too. Take Portrait Mode, which the 2018 iPad Pro supported only with the front-facing camera. With an actual 3D depth map of the scene instead of lens-based calculations to estimate depth, Apple could add rear-facing Portrait Mode to the iPad Pro and improve the feature's accuracy and speed.
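As a simplified sketch of the idea (illustrative values and falloff, not Apple's actual pipeline), a depth map can drive a portrait-style blur directly: pixels at the subject's depth stay sharp, while pixels further behind it receive a progressively larger blur radius.

```swift
/// Maps each depth sample (in metres) to a blur radius: pixels at the
/// subject's depth stay sharp, pixels further behind it blur progressively.
/// The 3-metre falloff band and 12-pixel maximum are illustrative assumptions.
func blurRadii(for depthMap: [Double],
               subjectDepth: Double,
               maxRadius: Double = 12.0) -> [Double] {
    return depthMap.map { depth in
        // Distance behind the subject, scaled over an assumed 3-metre falloff.
        let behind = max(0.0, depth - subjectDepth)
        let falloff = min(behind / 3.0, 1.0)
        return falloff * maxRadius
    }
}

// Example: a subject at 1.2 m stays sharp; a wall at 4 m is blurred heavily.
print(blurRadii(for: [1.2, 1.5, 4.0], subjectDepth: 1.2))
// ≈ [0.0, 1.2, 11.2]
```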
Apple's LiDAR Scanner has launched first on the new 11- and 12.9-inch iPad Pro, as previously rumored. But the system is also widely expected to arrive on some 2020 iPhones.
The latest information, pulled from code within an iOS 14 leak, suggests that a time-of-flight camera will arrive on both the "iPhone 12 Pro" and the "iPhone 12 Pro Max" this year.
On those devices, a LiDAR Scanner would bring the same improvements to ARKit apps and photography. Combined with Ultra Wideband technology, it could also be useful for applications such as indoor navigation and item tracking.
LiDAR is a new addition to Apple's handheld devices, but the Cupertino tech company has actually been using the technology for years in other applications. Apple vehicles with LiDAR sensors have been spotted in California as far back as 2015. The technology is considered a crucial part of autonomous vehicle development, since it lets cars accurately analyze their environment.
Amid rumors of Project Titan and the "Apple Car," the company appears to be steadily investing in LiDAR and related research for vehicular applications, including a slew of patent applications related to the technology.
And in a rare public-facing example of its research, Apple also published a research paper in 2017 detailing LiDAR-based 3D object recognition systems for self-driving cars. Essentially, the system leverages the depth mapping of LiDAR and combines it with neural networks to vastly improve the ability of a self-driving car to "see" its environment and potential hazards.
Source: AppleInsider