iPhone 11 Deep Fusion Camera Explained
During last month’s event, Apple announced that a new image processing feature called “Deep Fusion” is coming to the iPhone 11 and iPhone 11 Pro models in a later software update. The feature enhances the quality of the images these devices capture, resulting in greater detail.
But how does Deep Fusion really work? Does it work in tandem with Night Mode on the new iPhone models? Here’s a quick look at how the new image processing technology works, according to AppleInsider.
Fusion means “to combine”
Deep Fusion is a computational photography feature that works by combining multiple photos into a single shot, resulting in highly detailed images. The feature makes use of the A13 Bionic chip’s Neural Engine to determine the best features that should be included in the resulting image.
Phil Schiller, Apple’s SVP of Worldwide Marketing, explained last month that the feature takes “four short images” and “four secondary images” before the user presses the shutter, then takes “one long exposure” the moment the shutter is triggered. Don’t be misled by Schiller’s wording, though.
The “four short images” refer to images taken with exposure values lower than normal. These give the iPhone the sharpness data it needs when processing the final photo. The “four secondary images,” on the other hand, are shots taken at normal exposure.
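To make that capture sequence concrete, here is a minimal Swift sketch of the idea: a small rolling buffer holds the short and normal exposures captured before the shutter press, and the long exposure is appended at trigger time. The Frame type, the buffer capacity, and the EV values are illustrative assumptions, not Apple’s implementation.

```swift
import Foundation

// Simplified stand-in for a captured frame; the real pipeline works
// on RAW sensor buffers, not a struct like this.
struct Frame {
    let exposureBias: Double   // EV offset relative to normal exposure
    let timestamp: TimeInterval
}

// Ring buffer that keeps the most recent pre-shutter frames, mirroring
// how the camera continuously captures exposures before the trigger.
final class PreCaptureBuffer {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int) { self.capacity = capacity }

    func add(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity { frames.removeFirst() }
    }

    // On shutter press: hand the buffered short/normal frames, plus one
    // long exposure taken at trigger time, to the fusion stage.
    func captureBurst(longExposure: Frame) -> [Frame] {
        return frames + [longExposure]
    }
}

// Usage: four short (underexposed) frames, four normal frames, one long.
let buffer = PreCaptureBuffer(capacity: 8)
for i in 0..<4 { buffer.add(Frame(exposureBias: -2.0, timestamp: Double(i))) }
for i in 4..<8 { buffer.add(Frame(exposureBias: 0.0, timestamp: Double(i))) }
let burst = buffer.captureBurst(longExposure: Frame(exposureBias: 1.0, timestamp: 8))
print("Frames handed to fusion:", burst.count)   // 9
```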
The Neural Engine selects the best combination of the short exposures and processes it, along with the long exposure image, to reduce noise. It then runs a four-step process that analyzes and refines the shots on a pixel-by-pixel basis.
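The selection-and-fusion step can be pictured roughly as follows. This toy Swift sketch scores each short exposure with a simple gradient-based sharpness metric, picks the sharpest one, and blends it with the long exposure pixel by pixel. The metric, the fixed blend weight, and the flat grayscale Image type are assumptions made for illustration; Apple has not published the actual algorithm.

```swift
// Images modeled as flat grayscale pixel arrays in [0, 1], row-major.
typealias Image = [Double]

// Sharpness score: sum of squared differences between neighboring pixels.
// Sharper frames have stronger local gradients, so they score higher.
func sharpness(_ image: Image) -> Double {
    guard image.count > 1 else { return 0 }
    var score = 0.0
    for i in 1..<image.count {
        let d = image[i] - image[i - 1]
        score += d * d
    }
    return score
}

// Per-pixel fusion: take detail from the sharp short exposure and
// brightness/low noise from the long exposure, via a fixed blend weight.
func fuse(short: Image, long: Image, detailWeight: Double = 0.6) -> Image {
    return zip(short, long).map { s, l in
        detailWeight * s + (1 - detailWeight) * l
    }
}

// Pick the sharpest of the short exposures, then fuse with the long one.
let shortExposures: [Image] = [
    [0.1, 0.9, 0.1, 0.9],   // high contrast: scores as sharpest
    [0.4, 0.5, 0.4, 0.5],
    [0.45, 0.5, 0.45, 0.5],
    [0.5, 0.5, 0.5, 0.5],
]
let longExposure: Image = [0.3, 0.6, 0.3, 0.6]

let best = shortExposures.max { sharpness($0) < sharpness($1) }!
let result = fuse(short: best, long: longExposure)
print(result)   // blended pixel values
```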
Each step focuses on a specific portion of the image, giving each subject a different level of attention and processing. The system identifies the subjects or elements in specific parts of the image and processes them accordingly. These may include, for example, scenery such as the sky and clouds, or finer details such as people and their clothing.
The Deep Fusion system ranks each element in the images so it knows which details to prioritize when combining the short and long exposure images into the final shot. The resulting image should be highly detailed in the areas users will appreciate most.
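As a rough picture of that ranking idea, the sketch below assigns each labeled region a detail priority and uses it as the per-pixel blend weight between the short (detail-rich) and long (low-noise) exposures. The region categories and weights are hypothetical; Apple’s actual ranking model is not public.

```swift
// Hypothetical region labels; smooth areas like sky get low detail
// priority, while textures like fabric get the most.
enum Region {
    case sky, skinAndHair, fabric, other

    // Higher weight = pull more fine detail from the short exposures.
    var detailPriority: Double {
        switch self {
        case .sky:         return 0.2
        case .other:       return 0.5
        case .skinAndHair: return 0.8
        case .fabric:      return 0.9
        }
    }
}

// Fuse one pixel according to its region's detail priority.
func fusePixel(short: Double, long: Double, region: Region) -> Double {
    let w = region.detailPriority
    return w * short + (1 - w) * long
}

// Example: a 4-pixel strip labeled sky / skin / fabric / other.
let labels: [Region] = [.sky, .skinAndHair, .fabric, .other]
let shortPixels = [0.2, 0.7, 0.9, 0.5]
let longPixels  = [0.3, 0.6, 0.6, 0.5]

let fused = zip(zip(shortPixels, longPixels), labels).map { pixels, region in
    fusePixel(short: pixels.0, long: pixels.1, region: region)
}
print(fused)   // textured regions lean toward the short exposure's values
```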
Deep Fusion isn’t officially out yet, but those who install the iOS 13.2 developer beta 1 can already try it. It kicks in under medium to low light, but it does not work together with Night Mode.