Apple said the LiDAR sensor measures the distance between the user and surrounding objects by timing how long emitted light takes to reflect back. For app developers, the technology can produce an accurate depth map of the scene, speeding up AR games and applications and enabling new experiences.

In addition, combined with the iPhone 12's machine learning capabilities and the related development frameworks, LiDAR helps the device better understand the real environment, for example by scanning indoor spaces and objects and enabling new photo and video effects, while the iPhone 12 Pro can also improve shooting performance in low-light conditions.

During the live stream, Apple used Snapchat as an example to demonstrate a unique filter that will be built for the iPhone 12 Pro series. In the new AR mode, users will be able to see tables and floors covered with flowers and plants, with lifelike birds flying toward them.
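As a rough illustration of how developers can reach this depth data, the following is a minimal Swift sketch using ARKit's scene-depth frame semantics, which are only available on LiDAR-equipped devices such as the iPhone 12 Pro. The class name DepthReader is hypothetical, and this is a simplified sketch rather than Apple's own sample code.

```swift
import ARKit

/// Hypothetical helper that requests LiDAR scene depth from ARKit
/// and inspects the per-frame depth map.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device (e.g. iPhone 12 Pro).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth is not supported on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Called for every camera frame; sceneDepth carries a CVPixelBuffer of
    // per-pixel distances (in meters) from the device to the scene.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthData = frame.sceneDepth else { return }
        let depthMap = depthData.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Received a \(width)x\(height) depth map")
    }
}
```

An AR game or camera app could feed this depth map into its rendering or effects pipeline, which is the kind of use Apple highlighted for occlusion-aware AR scenes and improved photo and video effects.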
