Ever since Apple introduced the new iPhones, there has been a lot of curiosity about Deep Fusion. For those living on a non-tech planet, Deep Fusion is a new computational photography technology – or “computational photography mad science,” as Apple’s Phil Schiller called it in his typically understated manner – introduced on the iPhone 11 series (the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max). The feature was not available on the iPhones at launch but has since been rolled out via a software update, iOS 13.2.
In simple terms, Deep Fusion takes the whole concept of HDR photography to another level. In most HDR photography, the camera takes two or three photographs at different exposure levels and then combines them into a single photograph; Deep Fusion, by contrast, takes nine shots at different settings and then puts them together, taking the best of each to give you a single superb shot.
It does all this at incredible speed, something Apple credits to the A13 processor. The result is photographs that can potentially show better detail and less noise, especially on textured surfaces.
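To get an intuition for the exposure-merging idea described above, here is a toy sketch in Python. This is emphatically not Apple’s actual pipeline – Deep Fusion’s pixel-by-pixel machine-learning analysis is far more sophisticated – just a minimal illustration of the basic bracketing concept: each pixel is taken mostly from whichever frame exposed it best. The function name, weighting scheme, and sample values are all illustrative assumptions.

```python
# Toy illustration (NOT Apple's actual Deep Fusion pipeline) of merging
# bracketed exposures: each pixel is weighted toward whichever frame
# exposed it closest to mid-tone. All names and values are illustrative.
import numpy as np

def merge_exposures(frames, sigma=0.2):
    """Blend same-size grayscale frames (values in 0..1), weighting each
    pixel by how close it sits to mid-exposure (0.5)."""
    stack = np.stack(frames)                       # shape (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)                 # normalize per pixel
    return (weights * stack).sum(axis=0)

# Three "brackets" of the same scene: under-, normally, and over-exposed
scene = np.linspace(0.0, 1.0, 5).reshape(1, 5)
under, normal, over = scene * 0.5, scene, np.clip(scene * 1.5, 0.0, 1.0)
merged = merge_exposures([under, normal, over])
print(merged.round(2))
```

The merged result keeps shadow and highlight detail that no single bracket captures on its own, which is the core trick behind both conventional HDR and, in a much more elaborate form, Deep Fusion.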
All of which sounds great on paper. The problem is that there is no way of really knowing when it is working. It simply kicks in when the camera feels the need for it – a sort of background operator switching from Bruce Wayne to Batman depending on the need of the hour. That is understandable, but it also means there is no real way of knowing whether Deep Fusion has worked or you have just got a very good shot from routine HDR (which, incidentally, is very good as well – hey, this IS the iPhone).
Well, there is a way of making Deep Fusion work when you want it to – and it almost always works. No, it is not exactly intuitive but it is relatively simple to do.
We know that Deep Fusion does not work on the ultra-wide lens of the iPhone 11 series, so any shot taken with the ultra-wide sensor will not use it. Stick to the main sensor (1x) or the telephoto (2x). But that is not all. You also need to go into the camera settings and turn off the “Photos Capture Outside the Frame” option, because it too uses the ultra-wide sensor, even when you are shooting from the main or telephoto sensor. Another thing to keep in mind: while Deep Fusion can work with or without Smart HDR enabled, it does not work on burst mode shots.
That’s it, really. For those who would like it step by step (just make sure your iPhone 11, 11 Pro, or 11 Pro Max is updated to iOS 13.2 or higher):
- Go to Settings on your iPhone 11 series device.
- Select the Camera option.
- In the Camera menu, turn off “Photos Capture Outside the Frame”.
- Launch the Camera app.
- Take pictures using either the main sensor or the telephoto sensor – do NOT use the ultra-wide camera.
- In most cases, your results will use Deep Fusion technology!
Just remember that Deep Fusion will not always make a massive difference to your shot. In fact, sometimes it will look just like your normal snap – you can compare a Deep Fusion picture with a normal one by simply turning the “Photos Capture Outside the Frame” option back on in the Camera menu, which results in a Deep Fusion-less shot. But yes, sometimes the difference can be spectacular – we found this to be the case more in low-light shots than in textured ones, to be honest.
Incidentally, there is also a way of finding out which of your iPhone photographs were taken using Deep Fusion (checking which used the ultra-wide sensor and which did not can be a little tedious). Stay tuned for our next tip!
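For the scripting-inclined, one way to do that tedious lens check off the phone is to read a photo’s EXIF metadata – the LensModel tag records which camera module took the shot. The sketch below uses the Pillow library; the file name and lens string are illustrative assumptions, not real iPhone data, and this is just one possible approach rather than an official method.

```python
# A sketch of checking which lens a photo used via its EXIF LensModel
# tag (0xA434), using Pillow. The demo file name and lens string below
# are fabricated for illustration; real iPhone strings differ.
from PIL import Image

LENS_MODEL_TAG = 0xA434  # standard EXIF "LensModel" tag

def lens_model(path):
    exif = Image.open(path).getexif()
    # LensModel normally sits in the Exif sub-IFD; fall back to IFD0
    return exif.get_ifd(0x8769).get(LENS_MODEL_TAG) or \
        exif.get(LENS_MODEL_TAG, "unknown")

# Demo: write a tiny JPEG with a made-up lens string, then read it back
img = Image.new("RGB", (8, 8))
exif = Image.Exif()
exif[LENS_MODEL_TAG] = "iPhone 11 Pro back camera 4.25mm f/1.8"
img.save("demo.jpg", exif=exif)
print(lens_model("demo.jpg"))
```

On a real photo library, you would run `lens_model` over your exported shots: anything reporting the ultra-wide module cannot have used Deep Fusion.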