Lidar sensors in devices such as the iPhone 12 Pro can already be used to assist autofocus, a feature that is particularly useful in low light. Apple now appears to be applying the same idea to "Apple Glasses": combined with radar and the lidar already found in the iPad Pro and iPhone 12 Pro, the headset could sense the wearer's surroundings when the light is too dim to see clearly.

"Head-Mounted Display With Low Light Operation" is a newly disclosed Apple patent application that describes a variety of methods for helping head-mounted display (HMD) wearers perceive their surroundings. Apple notes in the application that the human eye has different sensitivity under different lighting conditions. In dim light, where the eye relies on low-light (scotopic) vision, a person's ability to observe the environment is worse than it is in good light.

Apple's solution is to use sensors in an HMD such as "Apple Glasses" to record the surrounding environment and then present the result to the wearer as unspecified graphic content. The key to recording the environment is sensing the distance to objects, that is, detecting depth. A depth sensor detects the environment, in particular the depth (distance) to objects in it. It generally includes an illuminator and a detector: the illuminator emits electromagnetic radiation (such as infrared light) into the environment, and the detector observes the radiation reflected back by objects.

Apple is careful not to limit the possible detectors or methods described in the application, but it does provide specific examples. One is structured light, in which a known pattern is projected into the environment and depth is inferred from how that pattern appears; another is time of flight, in which depth is derived from the time the emitted light takes to return. In other examples, the depth sensor may be a radio detection and ranging (radar) sensor or a light detection and ranging (lidar) sensor, and the application notes that more than one type of depth sensor can be used, for example structured light sensors, time-of-flight cameras, radar sensors, and/or lidar sensors in combination.

The headset could also use ultrasonic sensors. Whatever combination the headset relies on, the stated purpose of the patent application is to accurately measure the surrounding environment and convey that information to the wearer. According to the filing, a controller determines the graphic content based on the sensing of the environment by one or more of the infrared, depth, and ultrasonic sensors, and operates the display to present graphic content corresponding to that sensing.
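To make the filing's two technical ideas concrete, here is a minimal sketch in Swift: a time-of-flight depth estimate (half the round-trip travel time multiplied by the speed of light) and a toy controller that only produces assistance graphics when ambient light falls below a threshold. The type names, the lux threshold, and the graphicContent(for:) function are illustrative assumptions, not Apple's actual implementation.

```swift
// Hypothetical illustration of two ideas in the patent filing:
// (1) a time-of-flight depth estimate, and
// (2) a controller that chooses graphic content from ambient-light and depth readings.
// Names and thresholds are assumptions, not Apple's implementation.

/// Speed of light in metres per second.
let speedOfLight = 299_792_458.0

/// Time-of-flight depth: the emitted pulse travels to the object and back,
/// so the one-way distance is half the round trip.
func depthFromTimeOfFlight(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

/// Simplified stand-in for the headset's sensor readings.
struct SensorSample {
    let ambientLux: Double              // from an ambient-light / infrared sensor
    let nearestObstacleMetres: Double   // from lidar/radar/ultrasonic depth sensing
}

enum GraphicContent {
    case none                           // enough light; show nothing extra
    case outline(distance: Double)      // low light; highlight the nearby obstacle
}

/// Toy "controller": only renders assistance when the scene is too dim
/// for the wearer to see clearly, as the filing describes.
func graphicContent(for sample: SensorSample,
                    lowLightThresholdLux: Double = 10.0) -> GraphicContent {
    guard sample.ambientLux < lowLightThresholdLux else { return .none }
    return .outline(distance: sample.nearestObstacleMetres)
}

// Example: a pulse returning after about 13.3 nanoseconds corresponds to roughly 2 metres.
let depth = depthFromTimeOfFlight(roundTripSeconds: 13.3e-9)
let sample = SensorSample(ambientLux: 2.0, nearestObstacleMetres: depth)
print(graphicContent(for: sample))      // prints the outline case at roughly 2 metres
```

In practice the controller would fuse several of the sensor types the filing lists rather than a single reading, but the structure is the same: sense depth, check the light level, and decide what to draw for the wearer.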