This year's iPhone Pro will add a rear depth camera built on Sony's lidar technology, enhancing autofocus, improving portrait mode, and making AR applications more realistic, people familiar with the matter said. The phone will get the same Sony lidar depth camera that Apple installed in the 2020 iPad Pro released in March.

After years of planning, the technology will appear in Apple's top "pro" iPhone, due to be announced this fall, the sources said. The lidar system, made by Sony, uses light pulses to accurately measure the distance from an object to the camera lens. With this data, the camera can autofocus more precisely and better distinguish the foreground from the background to create effects such as portrait mode. The depth camera will also help AR applications place digital objects in the real world more realistically.

Since the iPhone X in 2017, high-end iPhones have carried a different 3D camera system on the front, branded TrueDepth and used for Face ID authentication and Animoji. That system projects 30,000 infrared dots onto the user's face to form a 3D map of its contours. It is highly accurate at distances under one meter, making it well suited to measurements at arm's length from the phone.

This precise distance data supplements the visual data collected by the phone's other camera sensors, letting the camera better understand the objects in front of it. For example, it can render foreground objects sharply and distinctly against background features that appear slightly soft and blurred, as the human eye naturally perceives them. Refining these nuances can help the iPhone's autofocus create more realistic photos, closer to those taken by traditional cameras with full-size lenses.

The lidar system will also improve the phone's AR experience.
Depth data, combined with data from the other camera sensors and the motion sensors, will help AR software place digital images more accurately within the real scene seen through the camera lens. Lidar can also bring higher accuracy to tasks such as measuring spaces, objects, or people; a medical rehabilitation AR application, for example, might use depth data to measure the range of motion of a patient's arm more precisely.

It is worth noting that the new products are likely to arrive later than Apple's usual autumn schedule, as the coronavirus has slowed Apple's parts suppliers. Luca Maestri, Apple's chief financial officer, said on the company's third-quarter earnings call that the new iPhone would be available within weeks of the launch. Apple typically holds its autumn event in the first or second week of September, but this year it could happen almost any time in September or October. The event will almost certainly be held online, and may follow the infomercial-like style the company used for its WWDC presentation.
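To make the lidar principle mentioned above concrete, here is a minimal arithmetic sketch of direct time-of-flight ranging: a light pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. This is an illustrative textbook calculation, not Apple's or Sony's actual implementation; the function name and sample timing value are invented for the example.

```python
# Illustrative sketch of direct time-of-flight (ToF) ranging, the principle
# behind lidar depth sensing. Not based on any real sensor's firmware.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in a vacuum


def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the object: the pulse travels out and back, so halve it."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0


# Light covers one metre in about 3.34 nanoseconds, so an echo arriving
# roughly 33.4 ns after the pulse left implies an object about 5 m away.
print(round(tof_distance(33.356e-9), 2))
```

The nanosecond-scale timings involved are why such sensors need dedicated hardware; they also hint at why accuracy at room-scale distances is a meaningful engineering achievement.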