When it comes to DSLR technology, there seems to be quite a bit of confusion about how exactly phase detection autofocus works. While for most people this might not be a topic of great interest, if you are wondering how and why a camera could have an autofocus problem, this article will shed some light on what happens inside the camera, in terms of autofocus, when a picture is taken. There is an overwhelming amount of negative feedback on autofocus issues with such fine tools as the Canon 5D Mark III, Nikon D800, Pentax K-5 and other digital SLR cameras, and most photographers do not seem to understand that the underlying problem is not necessarily with a specific model or type of camera, but rather with the specific way these cameras acquire focus. If you search the Internet, you will find thousands of autofocus reports on all kinds of DSLRs dating back 10+ years. Hence, the front focus and back focus issues we see in modern cameras are nothing new – they have been there ever since the first DSLR with a phase-detect sensor was created.
How DSLR Cameras Work
To understand this issue in more detail, it is important to first get to know how a DSLR camera works. Typical DSLR illustrations only show a single reflex mirror positioned at a 45-degree angle. What they don’t show is that there is a secondary mirror behind the reflex mirror that reflects a portion of light into a phase-detect sensor. Take a look at the simplified illustration below that I made from a sample Nikon D800 image:
Here is the description of each number shown in the above illustration:
1. Ray of light
2. Main/Reflex Mirror
3. Secondary Mirror, also known as “Sub-Mirror”
4. Camera Shutter and Image Sensor
5. Eccentric pin (1.5mm hex) for adjusting the Main Mirror
6. Eccentric pin (1.5mm hex) for adjusting the Secondary Mirror
7. Phase Detect Sensor (AF Sensor)
8. Pentaprism
9. Viewfinder
Let’s take a look at what happens inside the camera when a picture is taken. Light rays enter the lens (1) and make it into the camera. A partially transparent main mirror (2) is positioned at a 45-degree angle, so it reflects most of the light vertically into the pentaprism (8). The pentaprism magically converts the vertical light back into horizontal light and flips it, so that you see exactly what the lens sees when you look through the viewfinder (9). A small portion of light passes through the main mirror and is reflected downward by the secondary mirror (3), which is also tilted at an angle (54 degrees on many modern Nikon cameras, as illustrated above). Next, the light reaches the Phase Detect / AF Sensor (7), which redirects it to a group of sensors (two sensors per AF point). The camera then analyzes and compares the images from these sensors (similar to how focus is evaluated on a rangefinder) and, if they do not look identical, instructs the lens to make the proper adjustments (see below for more details).
While the above process looks more or less straightforward, there is one major problem with this approach: the Phase Detect sensor is what instructs the lens to make adjustments, while the image is captured by a completely different device – the image sensor on the back of the camera. Why is this a problem? Remember, when you take a picture, both the main and the secondary mirrors flip up, the shutter opens and the light from the lens directly hits the camera sensor (4). For phase detection autofocus to work correctly, the distance between the lens mount and the camera sensor and the distance between the lens mount and the Phase Detect sensor must be identical. If there is even a slight deviation, autofocus will be off. On top of this, if the angle of the secondary mirror is not precisely what it should be, that will also result in autofocus issues.
How the Phase Detect Sensor Works
As I have already said above, the phase-detect system works similarly to rangefinder cameras. The light that bounces off the secondary mirror is received by two or more small image sensors (depending on how many focus points an AF system has) with microlenses above them. For each focus point you see in the viewfinder, there are two tiny sensors for measuring phase difference – one for each side of the lens, as shown in the illustration at the top of the page (7). (The illustration exaggerates this behavior by showing two separate light beams reaching two separate sensors; in fact, a modern phase-detect device contains many more than two sensors, and they are located very close to each other.) When the light reaches these two sensors, if an object is in focus, light rays from the extreme sides of the lens converge right in the center of each sensor (like they would on an image sensor). Both sensors then contain identical images, indicating that the object is indeed in perfect focus. If an object is out of focus, the light no longer converges and instead hits different sides of the sensors, as illustrated below (image courtesy of Wikipedia):
Figures 1 to 4 represent conditions where the lens is focused (1) too near, (2) correctly, (3) too far and (4) way too far. The graphs show that the phase difference between the two profiles can be used to determine not just in which direction, but by how much to change the focus to achieve optimal focus. Note that in reality, the lens moves instead of the sensor.
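To make the idea concrete, here is a minimal sketch (not actual camera firmware – the profiles, function name and window size are invented for illustration) of how the shift between the two sensor profiles can be estimated by sliding one against the other and looking for the best match:

```python
# Toy phase-difference estimator: the two AF sub-sensors record the same
# intensity profile, displaced in opposite directions when out of focus.
# The shift that best aligns them gives both direction and magnitude of
# the focus error.

def best_shift(left, right, max_shift=5):
    """Return the integer shift of `right` relative to `left` that
    minimizes the mean squared difference over the overlapping samples."""
    best, best_err = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        err, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                err += (left[i] - right[j]) ** 2
                count += 1
        if count == 0:
            continue  # no overlap at this shift
        err /= count
        if err < best_err:
            best, best_err = s, err
    return best

# `right` is `left` displaced by 2 samples, as with a defocused subject.
left  = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0]
print(best_shift(left, right))  # prints 2 – a phase difference of 2 samples
```

The sign of the result distinguishes front focus from back focus, and its magnitude tells the camera how far to drive the lens.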
Since the phase-detect system knows if an object is front focused or back focused, it can send exact instructions to the camera lens on which way to turn its focus and by how much. Here is what happens when a camera acquires focus on a subject (closed-loop AF operation):
- The light that passes through the extreme sides of the lens is evaluated by two image sensors
- Based on how the light reaches the image sensors, the AF system can determine if an object is front or back focused and by how much
- The AF system then instructs the lens to adjust its focus
- The above is repeated as many times as needed until perfect focus is achieved. If focus cannot be achieved, the lens resets and starts reacquiring focus, resulting in focus “hunting”
- Once perfect focus is achieved, the AF system sends a confirmation that the object is in focus (a green dot inside the viewfinder, a beep, etc)
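The closed-loop sequence above can be sketched in a few lines of code (a toy model with invented names, gains and tolerances – not how any real camera is programmed):

```python
# Toy closed-loop AF: the signed "phase error" tells the lens which way
# and roughly how far to move; the loop repeats until the error falls
# within tolerance, at which point focus is confirmed.

def acquire_focus(lens_position, subject_position, tolerance=0.01, max_tries=20):
    """Drive the lens toward the subject; return (position, attempts)."""
    for attempt in range(1, max_tries + 1):
        error = subject_position - lens_position  # signed: front vs. back focus
        if abs(error) <= tolerance:
            return lens_position, attempt         # confirmation ("green dot")
        lens_position += 0.8 * error              # lens moves most of the way
    return None, max_tries                        # gave up: focus "hunting"

position, tries = acquire_focus(lens_position=0.0, subject_position=1.0)
print(position, tries)
```

Note how few iterations the loop needs: because each measurement yields both direction and magnitude, the lens converges after a handful of corrections rather than scanning its whole focus range.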
All this happens in a fraction of a second, which is why the phase-detection system is much faster than the contrast-detection system (which relies on moving focus back and forth until maximum contrast is achieved, with lots of image data analysis happening at the image sensor level).
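For comparison, the trial-and-error character of contrast detection can be sketched as a simple hill climb (again a toy model – the contrast function, step sizes and one-directional search are invented simplifications):

```python
# Toy contrast-detection AF: with no phase information, the camera nudges
# focus step by step, watching image contrast, and shrinks the step when
# contrast stops improving. Real cameras also search in both directions.

def contrast_at(focus):
    """Stand-in for measured image contrast; peaks at focus == 1.0."""
    return -(focus - 1.0) ** 2

def contrast_detect(focus=0.0, step=0.2, min_step=0.01):
    steps = 0
    while step >= min_step:
        steps += 1
        if contrast_at(focus + step) > contrast_at(focus):
            focus += step   # contrast improved: keep moving
        else:
            step /= 2       # overshot the peak: refine the search
    return focus, steps

focus, steps = contrast_detect()
print(focus, steps)
```

Even in this tiny example, the contrast loop needs noticeably more iterations than the phase-detect loop, because every step is a blind probe rather than a directed correction.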
The phase-detection/AF system is a very complex system that sees improvements pretty much every time a higher-end camera line is refreshed. Over the years, the number of autofocus points has been increasing, as has the number of more reliable cross-type autofocus points. For example, the Canon 1D X and the Canon 5D Mark III have a whopping 61 focus points, 41 of which are cross-type. Take a look at this complex matrix of autofocus sensors on the camera:
Not only has the number of AF points increased, but so has their reliability. Most modern professional cameras come with extremely fast and highly configurable autofocus systems that can continuously track subjects and acquire focus.
DSLR Autofocus Problems
As you can see above, the phase detection autofocus system is very complex and requires high precision to produce accurate results. Most importantly, the phase-detect/AF system must be properly installed and aligned during the manufacturing process. If there is even a slight deviation – which does happen quite a bit in manufacturing – autofocus will be off. This is the main reason why phase detection has been a source of problems pretty much since the first DSLR with a phase-detect sensor came out. Understanding these possible deviations, all DSLR manufacturers have developed high-precision calibration systems that take them into account and allow each camera to be calibrated individually during the inspection and quality assurance (QA) process.
If a phase-detect sensor alignment problem is detected, the system performs automated, computerized testing that goes through each and every focus point and adjusts it in the camera. The points that are off are re-calibrated and re-adjusted, and the compensation values are written into the camera firmware. Think of this as a process similar to AF Fine Tune / AF Micro Adjust happening at the phase-detection level, except it is done for each AF focus point separately.
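Conceptually, the stored per-point compensation works something like the sketch below (the table format, point names and units are entirely hypothetical – actual firmware internals are not published):

```python
# Hypothetical per-focus-point calibration: during factory QA, a measured
# error for each AF point is stored as a compensation value, and every
# later phase measurement for that point is corrected before the lens
# is driven.

# Compensation values (in AF steps), keyed by focus-point ID,
# as might be written into firmware during QA.
af_compensation = {"center": 0, "top-left": -2, "bottom-right": 3}

def corrected_defocus(raw_defocus, af_point):
    """Apply the stored per-point compensation to a raw phase measurement."""
    return raw_defocus + af_compensation.get(af_point, 0)

print(corrected_defocus(10, "top-left"))  # prints 8
print(corrected_defocus(10, "center"))    # prints 10
```

This is also why user-facing AF Fine Tune only offers a single global offset per lens: the per-point table is set at the factory and is not exposed to the photographer.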
Since the main mirror allows light to pass through it, there is a loss of light and probably contrast to the sensor. You can see where that is heading… Another thought I had yesterday: when you use a full-frame lens on an APS-C sensor, some of the image is cropped. But since most lenses suffer from edge sharpness loss, what you’re left with is edge-to-edge sharpness on the sensor. Imagine a lens being reviewed and having no loss of sharpness across its throw. People would go crazy buying it. So why don’t camera companies design this into their systems – or do they already do this, and we still get edge softness because they only partially do it?
Interesting article, but it does not explain why separate calibration is needed for each lens. Can you explain why some lenses will focus without calibration while others will be off unless fine tuned?
What about AF calibration for EACH lens?! How does Nikon cope with calibrating AF for each separately produced lens? A body fabricated at plant A and a Nikkor lens at plant B have to be coupled and calibrated TOGETHER for accurate AF performance. How is this problem fixed?
Dear Nasim,
Another excellent article from you – more than a thank you! Not directly related to the topic of the article, but I am wondering how Live View mode could work according to the DSLR illustration you showed (I mean, how could LV mode work when the secondary mirror, as well as the shutter, block the light to the image sensor)?
It’s done by a method called ‘contrast detection’ on the image sensor itself. It is explained in the article below.
photographylife.com/autofocus-modes
It is said that it is easy to make things complicated, but very tough to make complicated things easy to understand. You have done a great job of explaining phase detection in the common man’s language. Thanks a million.
Prakash