How Phase Detection Autofocus Works

When it comes to DSLR technology, there seems to be quite a bit of confusion about how exactly phase detection autofocus works. While this might not be a topic of great interest to most people, if you are wondering how and why a camera could have an autofocus problem, this article will shed some light on what happens inside the camera, in terms of autofocus, when a picture is taken. There is an overwhelming amount of negative feedback on autofocus issues with such fine tools as the Canon 5D Mark III, Nikon D800, Pentax K-5 and other digital SLR cameras, and most photographers do not seem to understand that the underlying problem is not necessarily with a specific model or type of camera, but rather with the specific way these cameras acquire focus. If you search the Internet, you will find thousands of autofocus complaints on all kinds of DSLRs dating back 10+ years. Hence, the front focus and back focus issues we see in modern cameras are nothing new – they have been around ever since the first DSLR with a phase detect sensor was created.

How DSLR Cameras Work

To understand this issue in more detail, it is important to first get to know how a DSLR camera works. Typical DSLR illustrations show only a single reflex mirror positioned at a 45 degree angle. What they don’t show is that there is a secondary mirror behind the reflex mirror that reflects a portion of the light onto a phase detect sensor. Take a look at the simplified illustration below, which I made from a sample Nikon D800 image:

How Phase Detection Autofocus Works

Here is the description of each number shown in the above illustration:

  1. Ray of light
  2. Main/Reflex Mirror
  3. Secondary Mirror, also known as “Sub-Mirror”
  4. Camera Shutter and Image Sensor
  5. Eccentric pin (1.5mm hex) for adjusting the Main Mirror
  6. Eccentric pin (1.5mm hex) for adjusting the Secondary Mirror
  7. Phase Detect Sensor (AF Sensor)
  8. Pentaprism
  9. Viewfinder

Let’s take a look at what happens inside the camera when a picture is taken. Light rays enter the lens (1) and make it into the camera body. A partially transparent main mirror (2) is positioned at a 45 degree angle, so it reflects most of the light vertically into the pentaprism (8). The pentaprism converts the vertical light back into horizontal light and flips it, so that what you see through the viewfinder (9) matches the scene in front of the lens. A small portion of the light passes through the main mirror and gets reflected downward by the secondary mirror (3), which is also tilted at an angle (54 degrees on many modern Nikon cameras, as illustrated above). This light then reaches the phase detect / AF sensor (7), where it is directed onto a group of small sensors (two sensors per AF point). The camera then analyzes and compares the images from these sensors (similar to how focus is evaluated on a rangefinder) and, if they do not look identical, instructs the lens to make the proper adjustments (see below for more details).

While the above process looks more or less straightforward, there is one major problem with this approach. The phase detect sensor is the device that instructs the lens to make adjustments, while the image is captured by a completely different device – the image sensor at the back of the camera. Why is this a problem? Remember, when you take a picture, both the main and the secondary mirrors flip up, the shutter opens and the light from the lens hits the camera sensor (4) directly. For phase detection autofocus to work correctly, the distance between the lens mount and the camera sensor and the distance between the lens mount and the phase detect sensor (via the secondary mirror) must be identical. If there is even a slight deviation, autofocus will be off. On top of this, if the angle of the secondary mirror is not precisely what it should be, that will also result in autofocus issues.
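
To get a feel for why even a tiny mismatch matters, here is a small illustration using nothing more than the thin lens equation (1/f = 1/do + 1/di). It deliberately ignores the mirrors and the real AF module optics, and the numbers (a 50mm lens, a 0.1mm path difference) are invented purely for demonstration:

```python
# Toy thin-lens illustration (not a model of any real AF module): which object
# distance comes to perfect focus at a given lens-to-sensor path length?
def sharp_object_distance(focal_length_mm, image_distance_mm):
    # Rearranged thin lens equation: 1/d_o = 1/f - 1/d_i
    return 1.0 / (1.0 / focal_length_mm - 1.0 / image_distance_mm)

f = 50.0                # a 50mm lens (hypothetical numbers throughout)
sensor_path = 51.0      # optical path from lens to the imaging sensor, in mm
af_path = 50.9          # path to the AF sensor, here 0.1mm too short

print(round(sharp_object_distance(f, af_path)))      # ~2828 mm: what the AF module confirms as sharp
print(round(sharp_object_distance(f, sensor_path)))  # ~2550 mm: what the photograph actually renders sharp

# The AF system confirms focus on a subject about 2.83 m away, yet the recorded
# image is sharpest roughly 28 cm in front of that subject - a front focus error
# caused purely by the mismatched path lengths.
```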

How the Phase Detect Sensor Works

As I said above, the phase detect system works similarly to the way rangefinder cameras do. Light that bounces off the secondary mirror is received by two or more small image sensors (depending on how many focus points an AF system has) with microlenses above them. For each focus point you see in the viewfinder, there are two tiny sensors that measure the phase difference – one for each side of the lens, as shown in the illustration at the top of the page (7). (The illustration exaggerates this by showing two separate light beams reaching two separate sensors; in fact, a modern phase detect device contains far more than two sensors, and they are located very close to each other.) When the light reaches these two sensors and the object is in focus, light rays from the extreme sides of the lens converge right at the center of each sensor (just as they would on the image sensor), and both sensors record identical images, indicating that the object is indeed in perfect focus. If the object is out of focus, the light no longer converges and it hits different sides of each sensor, as illustrated below (image courtesy of Wikipedia):

Phase Detection Autofocus

Figures 1 through 4 represent conditions where the lens is focused (1) too near, (2) correctly, (3) too far and (4) way too far. The graphs show that the phase difference between the two profiles tells the camera not only in which direction to change focus, but also by how much, in order to achieve optimal focus. Note that in reality, the lens moves rather than the sensor.
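
The comparison itself can be thought of as finding the offset that best aligns the two one-dimensional intensity profiles. The sketch below is my own simplified illustration, not any manufacturer’s actual firmware: it estimates that offset with a plain cross-correlation, where a shift of zero means the two profiles coincide (in focus), and the sign and magnitude of the shift correspond to the direction and amount of defocus:

```python
import numpy as np

def estimate_phase_shift(profile_a, profile_b):
    """Return how many pixels profile_b is displaced relative to profile_a."""
    a = profile_a - profile_a.mean()   # remove the brightness offset so only
    b = profile_b - profile_b.mean()   # the pattern itself gets compared
    correlation = np.correlate(b, a, mode="full")
    # Re-center the peak index so that 0 means "profiles already aligned".
    return int(np.argmax(correlation)) - (len(a) - 1)

# Toy example: the same edge seen by the two AF sensors, displaced by 3 pixels.
edge = np.concatenate([np.zeros(20), np.ones(20)])
displaced = np.concatenate([np.zeros(23), np.ones(17)])
print(estimate_phase_shift(edge, displaced))   # -> 3, so drive focus to cancel the shift
```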

Since the phase detect system knows whether an object is front focused or back focused, it can send exact instructions to the lens on which way to turn its focus and by how much. Here is what happens when a camera acquires focus on a subject (a closed-loop AF operation; a small code sketch of the loop follows this list):

  1. The light that passes through the extreme sides of the lens is evaluated by two image sensors
  2. Based on how the light reaches the image sensors, the AF system can determine if an object is front or back focused and by how much
  3. The AF system then instructs the lens to adjust its focus
  4. The above is repeated as many times as needed until perfect focus is achieved. If focus cannot be achieved, the lens resets and starts reacquiring focus, resulting in focus “hunting”
  5. Once perfect focus is achieved, the AF system sends a confirmation that the object is in focus (a green dot inside the viewfinder, a beep, etc)
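
In code form, the loop above might look something like the following sketch. Everything here is a toy model I made up for illustration: the “phase error” is simply the distance from an ideal focus position, and the imperfect lens drive is emulated by applying only part of each requested correction:

```python
IN_FOCUS_TOLERANCE = 1.0   # acceptable residual phase error, in sensor pixels
MAX_ATTEMPTS = 10          # after this many corrections the camera gives up ("hunting")

def autofocus(lens_position, ideal_position):
    for _ in range(MAX_ATTEMPTS):
        # Steps 1-2: the phase detect sensor reports a signed defocus amount.
        phase_error = ideal_position - lens_position
        if abs(phase_error) <= IN_FOCUS_TOLERANCE:
            return lens_position           # Step 5: focus confirmed (green dot, beep)
        # Step 3: instruct the lens to move; a real focus motor is not perfectly
        # accurate, so only ~90% of the requested correction lands each time.
        lens_position += 0.9 * phase_error
        # Step 4: loop around and measure again.
    return None                            # focus could not be acquired

print(autofocus(lens_position=0.0, ideal_position=37.0))   # converges in just a few passes
```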

All this happens in a fraction of a second, which is why a phase detection system is much faster than a contrast detection system (which relies on moving focus back and forth until peak contrast is found, with lots of image data analysis happening at the image sensor level).
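
For comparison, here is an equally simplified sketch of contrast detection (again my own toy model, with a completely made-up sharpness function). Because the camera cannot tell which direction to go or by how much, it has to keep nudging focus and checking whether contrast improved, reversing and shrinking the step whenever it overshoots: many more measurement cycles than the single phase measurement above.

```python
def sharpness(position, ideal=37.0):
    """Made-up contrast score that peaks at the ideal focus position."""
    return 1.0 / (1.0 + (position - ideal) ** 2)

def contrast_detect_af(position=0.0, step=4.0):
    best = sharpness(position)
    measurements = 0
    while abs(step) >= 0.25:                # keep refining until the steps get tiny
        trial = sharpness(position + step)
        measurements += 1
        if trial > best:                    # contrast improved: keep moving this way
            position, best = position + step, trial
        else:                               # contrast dropped: reverse and halve the step
            step = -step / 2
    return position, measurements

print(contrast_detect_af())   # -> (37.0, 15): correct focus, but only after 15 trial measurements
```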

The phase detection/AF system is a very complex system that sees improvements pretty much every time a higher-end camera line is refreshed. Over the years, the number of autofocus points has been increasing, as has the number of more reliable cross-type autofocus points. For example, the Canon 1D X and the Canon 5D Mark III have a whopping 61 focus points, 41 of which are cross-type. Take a look at this complex matrix of autofocus sensors on the camera:

Canon EOS 1D X AF Layout

Not only has the number of AF points increased, but so has their reliability. Most modern professional cameras come with extremely fast and highly configurable autofocus systems that can continuously track subjects and acquire focus.

DSLR Autofocus Problems

As you can see above, the phase detection autofocus system is very complex and requires high precision to produce accurate results. Most importantly, the phase detect/AF system must be properly installed and aligned during the manufacturing process. If there is even a slight deviation, which does happen quite a bit in manufacturing, autofocus will be off. This is the main reason why phase detection has been a source of problems pretty much since the first DSLR with a phase detect sensor came out. Understanding these possible deviations, DSLR manufacturers have developed high-precision calibration systems that take them into account and allow for individual camera calibration during the inspection and quality assurance (QA) process. If a phase detect sensor alignment problem is detected, an automated, computerized test goes through each and every focus point and adjusts it in the camera. The points that are off are re-calibrated and re-adjusted, and the compensation values are written into the camera firmware. Think of this as a process similar to AF Fine Tune / AF Micro Adjust, except that it happens at the phase detection level and is done for each AF focus point separately.
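
Conceptually, the result of that factory (or service center) calibration can be pictured as a small per-focus-point correction table that the camera applies to every raw phase measurement. The sketch below is purely illustrative: the point names, offset values and units are invented, and real firmware formats are not public.

```python
# Hypothetical per-AF-point compensation table (invented values, in sensor pixels).
FOCUS_POINT_OFFSETS = {
    "center":      0.0,    # this point measured as accurate at the factory
    "left_outer": -1.8,    # this point front focuses, so its readings get pulled back
    "right_outer": 0.6,    # this point back focuses slightly
}

def corrected_phase_error(raw_phase_error, focus_point):
    """Apply the stored calibration offset before the AF loop acts on the reading."""
    return raw_phase_error + FOCUS_POINT_OFFSETS.get(focus_point, 0.0)

# A raw reading of +1.8 on the miscalibrated left point is treated as 0 (in focus),
# so the lens is no longer driven past the subject when that point is used.
print(corrected_phase_error(1.8, "left_outer"))   # -> 0.0
```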

In my next article, I will talk about the Nikon D800 asymmetric focus issue (the left AF issue), why it happened and what Nikon is doing to fix it.


About Nasim Mansurov

Nasim Mansurov is a professional photographer based in Denver, Colorado. He is the author and founder of Photography Life, along with a number of other online resources. Read more about Nasim here.

Comments

  1. Drazen B

    Another one of those fantastic articles from you, great and relevant read for both novice and pros alike.

    Thanks Nasim!

  2. GianCarlo

    I think that Nikon will hate you! Thank you!

  3. Mark Adams

    Thank you Nasim. Thousands of us look forward to that next post. Great work you are doing here!

  4. Oded Shopen

    I can’t believe how many times I’ve used a DSLR without knowing these fundamental facts. Very interesting stuff, thank you very much for taking the time to write this!

  5. Sergio

    Great explanation, thanks a lot for such good material.

  6. Judd

    Once again you have decomposed a potentially complicated subject into a concise and understandable lesson. Thank you for the effort that you put into this Nasim, it is greatly appreciated.

  7. Michael Baker

    Nasim – A GREAT read!! I am part engineering techno-geek and part photo artist and have never gotten over my fascination with the incredible technology we have today that provides detail and color definition in captured images which rivals the human eye.

  8. Don

    A masterpiece Nasim.. As usual, outstanding work… Hopefully this will put to rest the worriers and let them know that there is a fix. Again, great work…

  9. Well done. Thanks for the writeup.
    Allan

  10. Jorge

    Another fantastic post. You are one of the most prolific photography bloggers out there. I marvel at your energy and passion for the subject and how the quality of your information and writing are unwaveringly top-shelf with every post.

    As a D800 owner waiting for the return of his camera from the Melville service center for the left-focus issue, I also anxiously await your next article where someone I have come to trust puts rhyme and reason to this vexing problem.

    Thanks again

  11. Jano

    Nasim, thanks for the article! I’ve been into this stuff so it wasn’t new for me but I love how you break it down so almost everyone can understand the basic principles.

    Now I have one question I’ve been asking myself for a while, especially hearing about all those D800 issues. Since you kind of wrote about it in your last part I will take the chance to ask you.
    Wouldn’t it be possible to correct autofocus only via software? Basically telling the camera that focus point xy is front focused by n. So when acquiring focus the camera does not aim to get the exact same signal on both phase detect sensors, but rather aims for a front focus of n.

    If my thinking is right you could set up your camera in front of a large wallpaper with – say – one focus star for each AF point. Then run a software on the camera that automatically focuses point xy using contrast detection in LV. Now all the camera would have to do is simply read the value of the corresponding phase detection sensors to find out that this particular point is front focused by n.
    In theory, that software could automatically go through all the AF points and add a software compensation for any misalignment. This would eliminate the need for sending anything back to Nikon or rather Nikon should run this software on every camera during QA.

    Now, why isn’t it that simple? Is the phase detect only reliable enough in detecting the exact same signal rather than being able to detect a front focus of exactly n? I’m sure there’s some problem here why camera manufacturers don’t do this. On the other hand, isn’t this exactly what AF fine adjustments do on modern DSLRs?
    I don’t get it. In my head this is not as complicated as it appears to be in reality. Please help me out ;)

    On another note: This kind of automating AF fine adjustments using contrast detection should be included with every camera. Doing it by hand is so much more work…

  12. AK

    Nasim,

    2. Based on how the light reaches the image sensors, the AF system can determine if an object is front or back focused and by how much
    – I still couldn’t understand what is meant by ‘front focused’ and ‘back focused’. Is there a way to explain it better?

    • Axel

      Hi AK,

      This one is easy to figure out: Imagine you focus on an object (call it X), and the camera tells you “X is in focus”, i.e. the green dot in the viewfinder lights up instead of the left/right arrows.

      Take the shot. Now imagine that in the actual image taken, object X comes out blurred, i.e. not in focus.
      If objects in front of X are sharp, the camera exhibits a “front focus”.
      If objects behind X are sharp, the camera exhibits a “back focus”.

      Cheers,
      Axel

      • AK

        Thanks Axel, now this explanation made me visualize the problem. Cheers.

  13. Yes!! A beautiful and elegantly written article! I have been trying very hard to understand this mechanism (what with the recent hullabaloo over quality control) and it is with your help that I finally got it. Thank you, Nasim!

  14. MarkL

    A good read: Nikon D7000 Autofocus System Explained (http://bit.ly/MlWhnt)

  15. Joe

    Hi Nasim,

    Wonderful article and well written.. I wanted to understand this for a while!!

    couple of questions..

    1) What is the difference between vertical/horizontal/cross-type points, and why is cross-type better? Is it the relative position of the micro sensors and the light sampling at that location?

    2) Is it better to align the focus point at the boundary of the surface for the phase detection to work faster? I mean, if I focus on the eye, is it better to focus at the edge of the white or at the middle of the white?

    thanks
    Joe

  16. Hoang

    Dear Nasim,

    Great article, now I know why my D5000 always focuses behind the subject!!!

    Tks a lot for this :)

  17. foo

    Thanks, great article!

  18. lorenzo

    Excellent article Nasim, thank you!

    I am anxious to read your next one to know if and how Nikon will address this issue. So far they deny it, however they do fix the cameras that are affected.

    On a waiting list at B&H for over a month, I will be happy if my D800E arrives by Christmas, especially if this malfunction is corrected.

  19. Hooman

    Great explanation and illustration on a topic that I frequently searched the net for
    thank you Nasim

  20. Manzur Huq

    As usual, it is a great article.
    It reminded me of the Sir Barnes Neville Wallis, (26 September 1887 – 30 October 1979), who was an English scientist, engineer and inventor. He is best known for inventing the bouncing bomb used by the RAF in Operation Chastise (the “Dambusters” raid) to attack the dams of the Ruhr Valley during World War II. The raid was the subject of the 1955 film The Dam Busters, in which Wallis was played by Michael Redgrave. Among his other inventions were the geodetic airframe and the earthquake bomb. He also pioneered the concept of Hovercraft, Swing Wing fighter jets, and vertical Take off fighters.
    By putting two light beams on left and right tips of the each bomber wings pointed downwards, the exact elevation of the bombers were achieved for releasing the bombs, when the beams coincided on the water surface in front of the dams, the bombs then bounced on the water surface before hitting the side of the dams. The same way, as kids we played by throwing flat stones in ponds.
    This is off course used in the range finder cameras manually.
    The problem appears to be Manufacturing/Quality Control and not design, therefore, it should not be difficult for Nikon to fix this.
    Thanks a lot.
    Manzur

    • lorenzo

      Manzur,
      I hope you are not too optimistic about Nikon’s quality control. If you read what an “expert” (possibly a lawyer?) said on NikonRumors about this issue, you will probably agree with him.

      He basically said that today it takes hours for each camera to be re-aligned by Repairs technicians and that Nikon can’t afford to waste these hours in production at the factory.

      Given the fact that few people are able to test and prove this malfunction (the repair center wants a documented PDAF failure), Nikon still saves money on quality control, as only a low percentage of cameras are returned for service.

      It is sad, makes one lose faith in the Great Nikon of the old days but it might be the reality today.
      lorenzo

  21. Ümit Alper TÜMEN

    Hello, Nasim;
    Thank you very much for the very useful article. Before DSLRs (I mean film SLR cameras), phase detect sensors were already used for AF. Is there any difference between the SLR and DSLR phase detect sensor for the AF function? What about the sub-mirror angle of an SLR (you indicate 54 degrees for a DSLR)?
    Thanking you advance for your kind reply
    Best Regard,
    Ümit Alper TÜMEN

  22. Bijan

    Very informative article.
    Thanks a lot, Mr. Mansurov.

  23. Wilba

    Congratulations on a nice clear article, and for correctly describing PD AF as a closed-loop control process. For an in-depth exploration of that aspect, check out http://www.dpreview.com/articles/5402438893/busted-the-myth-of-open-loop-phase-detection-autofocus

  24. Oyunu

    I do agree with all the concepts you have offered on your
    post. They are really convincing and can definitely work.
    Nonetheless, the posts are very short for newbies. May you please extend them a little from next time?
    Thanks for the post.

  25. lpef

    great article, very clear

  26. Maarten

    This is a really good article; I’m using parts of it for a Dutch version on this subject.

    • Don

      I hope you bothered to ask Nasim before using his information…

      • Maarten

        Not yet, but the article will not be published before I’ve done that.

  27. Shawn

    So how does autofocus work on mirrorless cameras?

  28. Keith R. Starkey

    You know, I am not the least bit technical, and after reading (well, skimming) past this article, I thought “What am I wasting my time reading this for? I’m never going to understand, or need, this.” And then I went back and re-read it, and it makes perfect sense. (So I’m quitting my job and going to ingenn…engeener…that kind of school…after I learn to spell!).

    Thanks for a great article!
