When Apple introduced the new iPhone 11 generation, it promoted the new camera system with a feature called "Deep Fusion." Now the feature is expected to be released soon.
The magazine The Verge claims to have learned that Apple will ship a previously unreleased feature for the iPhone 11 and iPhone 11 Pro in the next iOS 13 beta: it is called "Deep Fusion." The feature is supposed to deliver even better shots in low or poor light by further processing the image with the A13 Bionic chip. It works entirely in the background and significantly improves image quality.
The Verge explains how Deep Fusion works as follows (quote):
- By the time you press the shutter button, the camera has already taken three images at a fast shutter speed to freeze the motion in the shot. When you press the shutter, it takes three additional shots and then one longer exposure to capture detail.
- These three regular shots and the long exposure are merged into what Apple calls a “synthetic long” – a big difference from Smart HDR.
- Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure – unlike Smart HDR, Deep Fusion only merges these two frames, nothing more. These two images are also processed for noise differently than in Smart HDR, in a way that is better suited to Deep Fusion.
- The images go through four stages of detail processing, pixel by pixel, each tailored to increasing levels of detail – skies and walls are at the lowest level, while skin, hair, fabrics and so on are at the highest. This produces a set of weights for blending the two images – taking detail from one and tone, color, and brightness from the other.
- The final image is created.
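The steps above can be sketched very loosely in code. This is not Apple's actual implementation – Deep Fusion runs on the A13 Bionic's neural engine and its internals are not public – but a toy illustration of the general idea: pick the most detailed short exposure, build per-pixel weights from local detail, and blend detail from the short frame with tone and brightness from the synthetic long exposure. All function names here (`local_detail`, `deep_fusion_sketch`) are hypothetical.

```python
import numpy as np

def local_detail(img, k=3):
    """Estimate per-pixel detail as local variance in a k x k window.
    A crude stand-in for Apple's (undisclosed) multi-band detail analysis."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    # Sliding k x k windows over the padded image, one per original pixel.
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return windows.var(axis=(-1, -2))

def deep_fusion_sketch(short_frames, synthetic_long):
    """Toy version of the described pipeline:
    1. pick the short exposure with the most detail,
    2. derive a per-pixel weight map from that detail,
    3. blend: detail from the short frame, tone/brightness from the long one."""
    # 1. Choose the short frame with the highest average local detail.
    best_short = max(short_frames, key=lambda f: local_detail(f).mean())
    # 2. Normalize detail into weights in [0, 1]:
    #    high detail -> trust the short exposure more at that pixel.
    detail = local_detail(best_short)
    w = detail / (detail.max() + 1e-12)
    # 3. Per-pixel convex blend of the two frames.
    return w * best_short + (1.0 - w) * synthetic_long

# Usage with random grayscale "frames" in [0, 1]:
rng = np.random.default_rng(0)
shorts = [rng.random((8, 8)) for _ in range(3)]
long_frame = rng.random((8, 8))
fused = deep_fusion_sketch(shorts, long_frame)
```

Because the blend is a per-pixel convex combination, the fused image always stays within the value range of its two inputs – a property the real pipeline presumably shares, even though its weighting is far more sophisticated.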
Exactly which beta this will be remains unclear for now – "Deep Fusion" will likely be released with the iOS 13.2 beta. The new developer preview should be released soon. (Photo by Vershinin89 / Bigstockphoto)




