The addition will make future iPhones far better suited to the AR experiences being created by developers using the ARKit development platform.
The laser system would have big implications for the iPhone camera, too. Laser autofocus offers a faster and more accurate way of measuring objects in the frame than the more passive methods used in iPhones now. The laser sends out light pulses that bounce off objects and return to the sensor, indicating the precise distance to each one. The camera lens can then focus on the desired part of a shot in milliseconds. Laser autofocus systems are already used in smartphones from Google, Huawei, OnePlus, and Asus.
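The time-of-flight principle behind laser autofocus is simple arithmetic: the sensor measures how long an emitted pulse takes to return, and distance follows from the speed of light. A minimal sketch (the function name and example timing are illustrative, not from any actual camera API):

```python
# Hypothetical illustration of time-of-flight ranging, the principle
# behind laser autofocus. A pulse travels to the object and back, so
# the one-way distance is half the round trip times the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the object in meters, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds corresponds to an
# object about one meter away.
print(depth_from_round_trip(6.67e-9))
```

The nanosecond timescales involved are why this is done in dedicated hardware; the point here is only that one timing measurement yields one depth value, which the lens controller can act on immediately.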
Current iPhones use another kind of autofocus called phase detection, which was introduced with the iPhone 6 in 2014 under the name “Focus Pixels.” In general terms, phase detection systems capture the light rays coming through opposite sides of the camera lens as two slightly offset images, then compare them to determine whether the lens is focusing too close or too far away. The offset between the two images is used to continually correct the camera's focus.
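The comparison at the heart of phase detection can be sketched in a few lines: treat the light from each side of the lens as a one-dimensional intensity profile, and find the shift that best aligns the two. The sign of that shift says which way focus is off; its magnitude says roughly how far. This is a simplified, hypothetical model (real sensors do this with dedicated pixel pairs and far more sophisticated signal processing):

```python
# Hypothetical sketch of the phase-detection idea: two intensity
# profiles from opposite sides of the lens are shifted copies of each
# other when the scene is out of focus. Finding the best-aligning shift
# (minimum mean squared difference) tells the lens which way to move.

def best_shift(left: list[float], right: list[float], max_shift: int = 3) -> int:
    """Return the integer shift that best aligns the two profiles."""
    best, best_err = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + s]) for i in range(n) if 0 <= i + s < n]
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best, best_err = s, err
    return best

# Two copies of an edge profile, offset by two samples: the detector
# recovers the offset, telling the lens which way (and how far) to move.
edge =    [0, 0, 0, 1, 1, 1, 0, 0, 0, 0]
shifted = [0, 0, 0, 0, 0, 1, 1, 1, 0, 0]
print(best_shift(edge, shifted))  # → 2
```

A zero shift means the image is in focus; a nonzero shift is converted directly into a lens movement, which is why phase detection can focus in a single step rather than hunting back and forth.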
The iPhone 8 arrives on the 10th anniversary of the iPhone, and Apple clearly wants it to represent a new chapter in the life of the device. The phone is expected to pack several features that are brand new to iPhones, including wireless charging, an edge-to-edge OLED display, and, possibly, sealed side buttons that respond with haptic feedback and are completely waterproof. And, as we reported yesterday, some of these new technologies are difficult to implement.
Nor are they cheap. The iPhone 8 is widely expected to cost hundreds more than earlier iPhones.
The “iPhone 8” will be one of three new iPhones announced by Apple in the fall. The other two will be the successors to the current iPhone 7 and 7 Plus, and will likely be called the iPhone 7S and 7S Plus.
Source: Fast Company