iOS 10 hasn't been available for long, but iOS 10.1 is already in developers' hands. iOS 10.1 introduces support for "Portrait mode" on the iPhone 7 Plus, which uses the device's dual cameras to mimic the shallow depth-of-field effect typically achievable only with a high-end DSLR: a foreground subject that stands out sharply against a blurred background.
To achieve this look, Apple's built-in image signal processor scans a scene, using machine learning techniques to recognize the people in it. From there, it creates a depth map of the image from the device's two cameras, keeping the people in focus while applying an artful blur, or "bokeh," to the background.
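Apple hasn't said exactly how that recognition works, and none of it is exposed to third-party apps in iOS 10.1. As a rough public analogue of the "find the people first" step, here's a minimal sketch using Core Image's long-standing face detector (faceRects is a hypothetical helper name):

```swift
import CoreImage

// Apple's person recognition runs on its image signal processor and isn't
// exposed to apps. Core Image's built-in face detector is a rough public
// analogue of the "find the people first" step.
func faceRects(in image: CIImage) -> [CGRect] {
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    let faces = detector?.features(in: image) ?? []
    return faces.map { $0.bounds }  // regions that should stay in focus
}
```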
According to TechCrunch, Apple's Portrait option was built on technology acquired from camera company LinX. Portrait mode uses the 56mm telephoto lens to capture the image, while the wide-angle lens gathers perspective data to build the depth map and divide the image into layers.
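That two-lens arrangement maps onto classic stereo vision: a point's pixel shift (disparity) between the two views, together with the focal length and the distance between the lenses, gives its depth. Apple's actual pipeline is proprietary; this is a minimal sketch of the textbook relationship, and the focal length and baseline figures below are illustrative assumptions, not Apple's values:

```swift
// Classic stereo relationship (not Apple's proprietary pipeline):
// depth = focalLength * baseline / disparity
func stereoDepth(disparityPixels: Double,
                 focalLengthPixels: Double,
                 baselineMeters: Double) -> Double {
    precondition(disparityPixels > 0, "disparity must be positive")
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Example with made-up numbers: a 2,800 px focal length, a ~1 cm lens
// baseline, and a 40 px disparity put a point roughly 0.7 m away.
let meters = stereoDepth(disparityPixels: 40,
                         focalLengthPixels: 2_800,
                         baselineMeters: 0.01)
```

Nearby objects shift a lot between the two views and distant ones barely move, which is what lets the processor sort pixels into depth layers.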
Once it has this 9-layer slice, it can then pick and choose which layers stay sharp and which get a soft Gaussian blur applied to them.
Once the telephoto lens detects the subject, using autofocus and the depth cues described above, the image processor inside the iPhone 7 Plus applies progressively stronger blur to the layers that sit further from that subject.
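Apple doesn't expose any of this to developers in iOS 10.1, but the described approach can be sketched with standard Core Image filters: blur a copy of the image harder for layers farther from the subject, then blend each copy back in through that layer's mask. Here layeredBokeh, the per-layer masks, and the blur-per-step amount are all assumptions for illustration:

```swift
import CoreImage

// A sketch of graduated, layer-by-layer bokeh: the subject's layer stays
// sharp, and each other layer is blurred in proportion to its distance
// from the subject, then composited back through that layer's mask.
func layeredBokeh(image: CIImage,
                  layerMasks: [CIImage],   // one grayscale mask per depth layer; white = in layer
                  subjectLayer: Int,
                  blurPerStep: Double) -> CIImage? {
    var result = image
    for (index, mask) in layerMasks.enumerated() {
        let distance = abs(index - subjectLayer)
        if distance == 0 { continue }                       // the subject's layer stays sharp
        guard let blur = CIFilter(name: "CIGaussianBlur") else { return nil }
        blur.setValue(image, forKey: kCIInputImageKey)
        blur.setValue(Double(distance) * blurPerStep,       // farther layer, stronger blur
                      forKey: kCIInputRadiusKey)
        guard let blurred = blur.outputImage?.cropped(to: image.extent) else { return nil }
        guard let blend = CIFilter(name: "CIBlendWithMask") else { return nil }
        blend.setValue(blurred, forKey: kCIInputImageKey)          // shown where the mask is white
        blend.setValue(result, forKey: kCIInputBackgroundImageKey) // everything composited so far
        blend.setValue(mask, forKey: kCIInputMaskImageKey)
        guard let next = blend.outputImage else { return nil }
        result = next
    }
    return result
}
```

With nine masks and the subject on, say, layer 2, layers 1 and 3 get one step of blur while layer 8 gets six, matching the graduated falloff described above.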
It's in beta, so there are some quirks Apple will need to work out. Apple has said that Portrait won't work in every situation; it appears to require good lighting and the right focusing distance, with enough separation between subject and background, to function properly. It will take some experimentation to get good shots with Portrait.
Portrait mode is a new feature in the Camera app, found alongside other video- and photo-taking options like "Video" and "Panorama." It even includes a live preview that lets you see what the image will look like before you take it, something that's unique to the iPhone.
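Portrait mode's depth pipeline isn't available to third parties, but the live-preview pattern itself is ordinary AVFoundation: pull frames from the camera, filter each one, and display the filtered result instead of the raw feed. A minimal sketch, with a plain Gaussian blur standing in for the depth effect (LivePreviewFilter and onFilteredFrame are hypothetical names):

```swift
import AVFoundation
import CoreImage

// Live-preview pattern: capture frames, filter them, hand them to the UI.
final class LivePreviewFilter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    var onFilteredFrame: ((CIImage) -> Void)?  // the UI layer draws these

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "preview.frames"))
        guard session.canAddInput(input), session.canAddOutput(output) else { return }
        session.addInput(input)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let frame = CIImage(cvPixelBuffer: buffer)
        // Stand-in effect: Portrait mode's real depth pipeline is not public.
        let preview = frame.applyingFilter("CIGaussianBlur",
                                           parameters: [kCIInputRadiusKey: 6.0])
        onFilteredFrame?(preview)
    }
}
```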