Photography Life

PL provides various digital photography news, reviews, articles, tips, tutorials and guides to photographers of all levels


What Is Lens Diffraction?

By Spencer Cox · 79 Comments

When photographers talk about lens diffraction, they are referring to the fact that a photograph grows progressively less sharp at small aperture values – f/16, f/22, and so on. As you stop down your lens to such small apertures, the finest detail in your photographs will begin to blur. With good reason, this effect can worry beginning photographers. However, if you understand how diffraction impacts your photographs, you can make educated decisions and take the sharpest possible photographs in the field. In this article, we will explore the topic of lens diffraction in detail and talk about different techniques you can utilize to avoid it.

The effects of diffraction – that your sharpness decreases at smaller and smaller apertures – are shown in the comparison below. Keep in mind that these are fairly extreme crops:

The Kiss diffraction
(To see the sharpness differences more clearly, click on the image. Pay particular attention to the pattern of colored dots on the woman’s face.)

The reason that this occurs is based upon the principles of physics; in short, as the aperture gets smaller and smaller, light waves spread out and interfere with one another increasingly more. This causes small details of your photographs to blur.

However, this explanation is overly simple, and it still can be confusing to beginning photographers. What, physically, causes diffraction? At what point does diffraction begin to blur your photographs? Is there anything you can do to prevent diffraction? Are expensive lenses better at controlling diffraction? The answers to all of these questions will be explained in-depth below.

1) What Is Diffraction?

In explaining diffraction, it can be difficult to straddle the line between avoiding and embracing references to optical physics. Most photographers are interested in day-to-day knowledge rather than comprehensive background information, but it is impossible to talk about diffraction without describing how it works at a fundamental level. That said, this section is meant to be understandable even if you are not a physicist; we recommend reading it, since it will provide a more solid foundation for your understanding of diffraction.

At its most basic, diffraction is the concept that waves – including light waves – can interfere with one another. In fact, every time that waves pass through a slit, they will interfere. To make this easy to visualize, consider waves of water. If you drop a rock into a perfectly still lake, you will cause a ripple of small waves to form. These waves spread out in concentric circles, just like the image below:

Puddle Wave Diffraction
(Image adapted from Wikimedia Commons)

What happens if you create a barrier to block the path of these waves? Quite simply, you would stop their movement. This is boring:

Still water purple
(The waves on the left-hand side, of course, would continue to bounce around; that isn’t shown in this diagram.)

To make it interesting, then, you cut a hole in the barrier so that water can pass. Now, what sorts of patterns would the waves create?

Question water purple

The waves look similar to how you might expect, although there are a few additional patterns that form aside from the primary wave:

Purple Line Wave
(Note that this diagram is slightly simplified. In the real world, you would only see the exact pattern of waves on the right-hand side if the incoming waves were perfectly parallel.)

These additional patterns are artifacts from the wave bending around the corners. They arise because the two corners act, essentially, as individual sources of waves – waves which can collide with one another. In certain areas of collision, the waves cancel each other out (destructive interference); that is why some areas of the diagram look completely still. In other places, though, the waves add together (constructive interference), which causes an additional pattern to form off to the sides.

To visualize this, let’s say that there is a sensor along the far-right edge of the diagram. This sensor measures the intensity of the waves at a given point, which increases with the amplitude of the wave. A graph of the intensity is shown below:

Single Slit Diffraction Graph

Clearly, the central pattern is the most significant. The patterns off to the side are still present, but they don’t have nearly the same intensity as the one in the center. This means that the central pattern is most significant in photography, as we will cover in a moment. For now, though, let’s see what happens with a large versus a narrow opening in the barrier. Note that the images below have been simplified, and only the central wave pattern is included:

Aperture comparison

The main difference between these two images is that the smaller opening results in a larger spread of waves, while the large opening causes much less spreading.

Take a look at a comparison between the graphs of the two waves:

Comparison of graphs

Although it may initially seem unusual that a small opening leads to a larger spread of waves, the illustrations above should show that it makes logical sense. Essentially, larger openings allow the waves to pass without much interference. Since the waves are not particularly disturbed, they follow a relatively straight path towards the edge of the pool. Smaller openings, though, affect a wave more significantly, causing it to bend at harsher angles. (This is a slight simplification; for more technical information, I recommend reading the Wikipedia page on the Huygens principle.)
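This inverse relationship between opening size and spread can be made concrete with the standard single-slit formula: the first dark fringe falls where sin θ = λ/a (wavelength divided by slit width). Here is a minimal sketch; the 550 nm wavelength and the two slit widths are illustrative choices, not values from the article:

```python
import math

def central_lobe_half_angle(wavelength_nm, slit_width_nm):
    """First minimum of single-slit diffraction: sin(theta) = wavelength / slit width."""
    ratio = wavelength_nm / slit_width_nm
    if ratio >= 1:
        return None  # slit narrower than the wavelength: no minimum, waves spread everywhere
    return math.degrees(math.asin(ratio))

# Green light (~550 nm) through a wide and a narrow slit:
wide = central_lobe_half_angle(550, 50_000)   # 50 µm slit
narrow = central_lobe_half_angle(550, 2_000)  # 2 µm slit
print(f"50 µm slit: ±{wide:.2f}°")   # a tight, nearly straight beam
print(f"2 µm slit:  ±{narrow:.2f}°") # a much wider spread
```

The narrow slit spreads its central lobe over roughly twenty-five times the angle of the wide one, which is exactly the pattern in the diagrams above.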

Finally, note that a “small” opening is relative. In fact, an opening only causes significant diffraction when it is similar in size to the wavelength passing through it. This is why light, which has a tiny wavelength, will not diffract noticeably through a ten-foot-wide opening – even though ocean waves, with their far longer wavelengths, would.

Congratulations! You now understand the physics of diffraction. At its most basic, a small opening causes waves to bend and interfere with one another; this, in turn, spreads out their signal.

2) Diffraction in Photography

Clearly, diffraction is an important concept in physics. In fact, a similar experiment (with two slits rather than one) played a major role in proving that light can behave as a wave – one of the most important discoveries in scientific history. But how does this impact your everyday photography?

Aperture blade diffraction photo
(Image from Wikimedia Commons)

It all comes down to the aperture of a lens. Shown in the photograph above, the aperture blades in a lens act as a single slit that passes waves of light. A pattern of the light’s intensity is exactly what you would expect to see:

Single Slit Diffraction Pattern
This looks familiar! That’s because light, similar to water, travels in waves. (Image from Wikimedia Commons.)

This, though, is a two-dimensional graph. In the real world, a pinpoint of light projects in three dimensions. So, a more accurate graph appears below:

3D Airy Disk
(Image from Wikimedia Commons.)

This three-dimensional pattern occurs every time that light shines through the aperture in your camera lens. When projected onto the sensor of your camera, it looks like this:

Airy Disk
(Image from Wikimedia Commons.)

The figure above shows what is known as an Airy disk. This is, quite simply, the appearance of a diffraction pattern when it hits your camera sensor. The central region is the brightest, and it has the largest effect on your photographs.

It isn’t difficult to tell why this Airy disk can cause a photograph to blur. We already know that a small opening – or, a small aperture – causes waves to spread out. This means that, at small apertures, the Airy disk becomes much larger. If you can envision the Airy disk as hitting your camera sensor, you get a picture that looks like this, where the grid represents the pixels on your sensor:

Sensor Airy Disk
(Note that, in reality, the Airy disk grows dimmer as the aperture grows narrower; to simplify the diagram, this effect is not shown here.)

Now, think of a scene as being composed of countless tiny sources of light. Every pinpoint of light travels through the aperture of your lens; as a result, each part of your photograph projects onto your sensor as an Airy disk. These, as shown above, become blurrier with small aperture values. This is the reason that you see diffraction!
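The size of the Airy disk can be estimated with the standard approximation d ≈ 2.44 · λ · N (wavelength times f-number, to the first dark ring). A quick sketch, assuming green light at 550 nm:

```python
# Approximate Airy disk diameter to the first dark ring: d ≈ 2.44 * λ * N.
def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Diameter in microns, assuming a circular aperture and green light by default."""
    return 2.44 * (wavelength_nm / 1000) * f_number

# Typical full-frame pixel pitches run roughly 4-8 µm, so the disk
# overtakes a single pixel somewhere around the f/8-f/16 range:
for f in (4, 8, 16, 22):
    print(f"f/{f}: ~{airy_disk_diameter_um(f):.1f} µm")
```

At f/4 the disk is around 5 µm; by f/22 it is nearly 30 µm, spilling across several pixels at once.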

3) High- Versus Low-Megapixel Cameras

The comparison above, showing an Airy disk hitting the pixels of your sensor, might prompt a question: if the pixels were larger, wouldn’t the Airy disk be less likely to bleed over?

In fact, that is completely true! Large pixels – those which are bigger than the Airy disk – do not show diffraction at the same apertures that a small-pixel camera would. For example, I might be able to stop down to f/11 on the 12-megapixel Nikon D700 before noticing any diffraction, while the 36-megapixel D800/D810 might show visible diffraction at any aperture smaller than f/5.6. These numbers aren’t set in stone, though; I recommend testing your own camera to see when diffraction begins to grow noticeable (and, more importantly, when it begins to grow objectionable).

However, this is not a mark against high-resolution sensors. In fact, if all your settings are the same, a high-resolution sensor will always capture at least as much detail as a low-resolution sensor of the same size. More pixels will never lead to lower detail, even at the tiniest of apertures. This means that, if you print your photos at the same size, a Nikon D800/D810 photo will always have at least as much detail as a Nikon D700 photo, all else equal.

That said, if you buy the Nikon D800/D810, chances are good that you want to print large or pixel-peep. If this is the case for you, diffraction absolutely is a bigger issue than it would have been with a low-resolution sensor! To get the best possible sharpness from a D800/D810, you should pay attention if your aperture is smaller than about f/8. Again, I recommend testing the exact boundaries of your camera yourself.
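One rough rule of thumb holds that diffraction starts to become visible once the Airy disk spans about two pixels. That criterion is a simplifying assumption (the true threshold is subjective and varies by viewing size), and the sensor figures below are approximate, but a quick estimate lines up reasonably well with the apertures mentioned above:

```python
# Rough estimate of where diffraction becomes visible: the aperture at which
# the Airy disk spans about two pixels. The two-pixel criterion and the sensor
# figures below are illustrative assumptions, not measured thresholds.
AIRY_FACTOR_UM = 2.44 * 0.55  # 2.44 * wavelength in µm, for green light (~550 nm)

def diffraction_limited_fstop(sensor_width_mm, horizontal_pixels, disk_pixels=2.0):
    """f-number at which the Airy disk grows to `disk_pixels` pixel pitches."""
    pixel_pitch_um = sensor_width_mm * 1000 / horizontal_pixels
    return disk_pixels * pixel_pitch_um / AIRY_FACTOR_UM

print(f"12 MP full frame (D700-like): ~f/{diffraction_limited_fstop(35.9, 4256):.1f}")
print(f"36 MP full frame (D800-like): ~f/{diffraction_limited_fstop(35.9, 7360):.1f}")
```

This yields roughly f/13 for the 12-megapixel sensor and f/7 for the 36-megapixel one; again, treat these as starting points for your own testing, not hard limits.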

Breaking
NIKON D800E + 105mm f/2.8 @ 105mm, ISO 100, 1/3, f/7.1

4) Small Versus Large Sensors

It is often said that crop-sensor cameras (i.e., DX Nikon cameras) show diffraction more easily than full-frame cameras (FX Nikon). Is this a myth, or does it hold true?

Let’s start with what we know. At a given aperture on a lens, the Airy disk will always be the same physical size. It doesn’t matter what sensor you use; this is a property of physics that only depends upon the aperture itself. For example, whether I put a 50mm f/1.8 lens on the full-frame D750 or the crop-sensor D3300, the size of its Airy disk projection will be identical (assuming the same aperture).

So, where’s the confusion? The issue arises from the fact that the same Airy disk takes up a larger percentage of a crop-sensor frame than a full-frame sensor. Take a look at the example below:

Crop vs full frame airy comparison

In fact, at an equal print size, a DX camera will show more diffraction than an FX camera. This is because the DX sensor is essentially a crop of the FX sensor; in other words, it magnifies everything in your photograph – including the diffraction – just like cropping in post-production.

The amount of additional diffraction is the same as your crop factor. So, for a 1.5x crop-sensor camera, multiply your aperture by 1.5 in order to see the equivalent diffraction on a full-frame camera. For example, the Airy disk at f/11 on a DX camera takes up roughly the same percentage of your sensor as the Airy disk at f/16 would on a full-frame camera.

Of course, if you use a DX camera, you may not print quite as large as you would with an FX camera. For many photographers, then, there is no practical difference; the smaller prints from a DX camera cancel out the additional diffraction. If you do print at large sizes with a DX camera, be aware that diffraction will be more significant at a given aperture.
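The crop-factor rule above is just a multiplication; a one-line sketch (the 1.5× factor applies to Nikon DX cameras, as in the examples above; other systems have different crop factors):

```python
# At equal print size, diffraction on a crop sensor looks like diffraction at
# (aperture * crop factor) on full frame. 1.5 is Nikon DX; other systems vary.
def full_frame_equivalent_aperture(f_number, crop_factor=1.5):
    return f_number * crop_factor

print(full_frame_equivalent_aperture(11))  # f/11 on DX ~ f/16.5 on FX
```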

Beach Sunrise
NIKON D7000 + 24mm f/1.4 @ 24mm, ISO 100, 1/250, f/5.6

5) Diffraction and Depth of Field

Diffraction decreases a photograph’s sharpness at small apertures. Yet, at the same time, small apertures increase the amount of depth of field in a photograph. This is not a contradiction, although it can be confusing at first. Look, for example, at the comparison below:

Depth of Field Comparison

As you can tell, the f/22 photo has much more of the scene within its depth of field. If I want this entire subject to be sharp, it is far better than the photograph at f/5.6. However, let’s look at the point of focus more closely:

Cropped Lizard Diffraction

As you can see, the f/5.6 photo is significantly sharper. (Click on the image to see it more clearly.)

This, of course, does not mean that you should shoot every photograph at f/5.6. If you need a large depth of field, feel free to use smaller apertures; sometimes, it’s worth the slight reduction in sharpness from diffraction.

6) Choosing the Sharpest Aperture

There is always diffraction at every single aperture of your lens. This has to be true; light always needs to bend through an aperture, even if it is very large. However, at wide apertures like f/2.8 or f/4, the Airy disk is much smaller than the pixels in your photograph. This means that diffraction is essentially impossible to see at such large apertures.

However, this doesn’t mean that large apertures are the sharpest on a given lens. As you likely know, a lens tends to be at its sharpest when its aperture is slightly stopped-down. For example, my 20mm f/1.8 lens is sharpest in the center at f/4. Below is a sharpness chart for such a lens:

So, why is the peak at an aperture of f/4 rather than f/1.8? That is slightly beyond the scope of this article, but the essence is that – at larger apertures – more light travels through the edges of a lens. Light passing through those edge regions is less well-corrected, which increases aberrations (such as spherical aberration) and decreases the sharpness of the photograph. A smaller aperture blocks the light that has traveled through the edges of the lens, which improves the sharpness of a photograph.

This effect, balanced with the decrease of sharpness from diffraction, is the reason that f/4 gives the greatest sharpness on a lens like the 20mm f/1.8.

How do you tell which aperture is sharpest on your lens? Simply look at the tested results online. However, don’t stress too much about always shooting at the “perfect” aperture. For one, even these test results can be ambiguous. In the chart above, for example, the corners of the lens are actually sharpest at f/8. So, depending upon your subject, you may prefer sharper corners rather than the sharpest possible center.

At the same time, even suboptimal apertures aren’t horribly blurry. I have made a few large prints from photographs taken at f/16, and their quality is more than enough for my needs. If you need an aperture like this – generally to increase your depth of field – don’t be afraid to use it.

(If you need the largest possible depth of field in a photograph, like many landscape photographers, I recommend reading about hyperfocal distance. There are many similarities between these two properties of photography.)
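For reference, the hyperfocal distance mentioned above follows a simple formula: H = f²/(N·c) + f, where c is the circle of confusion. A short sketch, assuming the commonly quoted c ≈ 0.03 mm for full frame and a 20mm lens as an example:

```python
# Hyperfocal distance: H = f^2 / (N * c) + f, where f is focal length,
# N the f-number, and c the circle of confusion (~0.03 mm is a common
# full-frame assumption; the 20 mm lens below is just an example).
def hyperfocal_m(focal_mm, f_number, coc_mm=0.03):
    h_mm = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    return h_mm / 1000  # meters

print(f"20 mm at f/8:  H ≈ {hyperfocal_m(20, 8):.2f} m")
print(f"20 mm at f/16: H ≈ {hyperfocal_m(20, 16):.2f} m")
```

Stopping down pulls the hyperfocal point closer (here from about 1.7 m to 0.85 m), which is exactly the depth-of-field gain you trade against diffraction.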

Castle
NIKON D800E + 24mm f/1.4 @ 24mm, ISO 100, 6/10, f/16.0

7) Avoiding Diffraction

Now that you understand diffraction, how do you make sure to avoid it in your photographs? Unfortunately, the simple answer is that you can’t. Diffraction is a result of physics. It doesn’t matter how good your lens is; diffraction will rob sharpness at smaller apertures no matter what.

Even though you cannot circumvent the laws of physics, there is one way to avoid diffraction in your photographs: use a larger aperture. If you need the absolute sharpest photograph, this is the only way to avoid the effects of diffraction. Are you photographing a scene that needs a large depth of field? Try focus stacking at an aperture of f/5.6 or f/8, where diffraction is minimal.

At the same time, if you did use a small aperture (say, f/16 or f/22), you can improve a photograph’s apparent detail by sharpening in post-processing. This doesn’t actually eliminate the effects of diffraction, but it is a simple way to improve photos taken at small apertures.

In theory, it is possible to correct for diffraction through a sharpening process known as deconvolution sharpening. This type of sharpening is most effective when one has a perfect model of the lens in question, including its exact optical characteristics. For this reason, generic deconvolution sharpening does not reduce the effects of diffraction to a meaningful degree; NASA, however, is known to use such a method to improve the sharpness of Hubble Telescope photographs. (Some camera manufacturers, including Pentax, may have a diffraction-reduction menu option; however, this is nothing more than a standard unsharp mask cooked into your RAW file.) If you want to test deconvolution sharpening, increase the “Detail” slider as much as possible in either Lightroom or Camera Raw. Of course, it will not be specific to your lens, which would be necessary for true diffraction reduction.
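To make the idea of deconvolution concrete, here is a toy one-dimensional Richardson–Lucy sketch in pure NumPy. This illustrates the principle only – it assumes the blur kernel (PSF) is known exactly, which, as noted above, is precisely what generic tools lack – and is nothing like what NASA or any raw converter actually ships:

```python
import numpy as np

def richardson_lucy_1d(blurred, psf, iterations=50):
    """Toy 1-D Richardson-Lucy deconvolution: iteratively re-estimate the
    sharp signal from a blurred one, given a known blur kernel (PSF)."""
    estimate = np.full_like(blurred, 0.5)
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        convolved = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(convolved, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# A sharp "edge", blurred by a known 3-tap kernel (a stand-in for diffraction
# blur, which a real deconvolver would have to model for the specific lens):
psf = np.array([0.25, 0.5, 0.25])
sharp = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
blurred = np.convolve(sharp, psf, mode="same")
restored = richardson_lucy_1d(blurred, psf)

# The restored signal lands much closer to the original than the blurred one.
print("blurred error: ", round(float(np.abs(blurred - sharp).sum()), 3))
print("restored error:", round(float(np.abs(restored - sharp).sum()), 3))
```

Because the kernel here is known perfectly, the edge recovers well; with a mismatched or unknown kernel – the situation with real lens diffraction – the result degrades quickly, which is why generic deconvolution sliders only help so much.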

However, although you can sharpen your photographs in post-processing, the best way to decrease diffraction is simply to use a larger aperture.

Last Light on Half Dome
NIKON D7000 + 105mm f/2.8 @ 105mm, ISO 100, 1/40, f/6.3

8) Extra Information

Aperture is a technical topic; so is the interaction between light and your camera sensor. Some of the information above is presented as a best-case scenario, and the reality can be slightly more complex. Most of the following information will not affect the actual appearance of your photographs, but it is worth covering some of these special cases.

For example, light with large wavelengths will diffract more readily than light with shorter wavelengths; this means that red light (with a wavelength of about 650 nm) leads to a larger Airy disk than blue light (about 475 nm) at the same aperture. So, in theory, you will see slightly less blur from diffraction if you are working in extremely blue light; in practice, this effect is small enough that it has little impact on your photographs.
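Using the standard Airy-disk approximation d ≈ 2.44 · λ · N, the red/blue difference is easy to quantify with the wavelengths given above:

```python
def airy_um(wavelength_nm, f_number):
    # Airy disk diameter in microns: d ≈ 2.44 * λ * N
    return 2.44 * wavelength_nm / 1000 * f_number

red = airy_um(650, 16)   # ≈ 25.4 µm
blue = airy_um(475, 16)  # ≈ 18.5 µm
print(f"At f/16, the red Airy disk is {red / blue:.2f}x the diameter of the blue one.")
```

The ratio is simply 650/475 ≈ 1.37 regardless of aperture – noticeable on paper, but, as noted, rarely visible in practice.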

Also, in most cameras, the pixels that combine to make a photograph do not all detect the same wavelengths of light. For sensors with a Bayer array of pixels (including Nikon, Canon, and Sony DSLR/mirrorless cameras), the number of green-sensing pixels is twice the number of red or blue pixels. This means that the pixel diagram presented earlier is a slight simplification; however, it does not change the fundamental point that blur from diffraction increases as the Airy disk grows.

Finally, the depiction of the Airy disk in this article is a bit simpler than it would appear in the real world. Above, I showed it as a series of concentric rings; in reality, though, that would only occur if the aperture were perfectly circular. Most lenses have seven, eight, or nine aperture blades, which (even when curved) are not quite circles. So, the “Airy disk” becomes an “Airy octagon.” However, there is no practical difference in the appearance of diffraction in your photographs; your photos will blur just the same as you stop down the lens.

If you have any questions about the finer points of diffraction, please feel free to ask a question in the comments section; a single article is too short to explain everything that there is to know about such a complex topic.

Beach Falls
NIKON D7000 + 17-55mm f/2.8 @ 55mm, ISO 100, 1/250, f/5.6

9) Conclusion

Given all of these technical caveats, diffraction can seem like an out-there, unusual topic to be discussing. However, its effects are clear and significant in your photographs, and they are well worth considering while you are taking pictures. Especially for landscape and architectural photographers – or anyone who wants to take sharp photos with a large depth of field – it is important to understand the trade-offs that come from shooting at a small aperture.

Diffraction is present in all your photographs, and – if you aren’t careful – it can rob some sharpness from your favorite images. However, once you have seen its effects in practice, accounting for diffraction will become second nature.

Filed Under: Photography Tutorials Tagged With: Advanced Photography Tips, Aperture, Lenses, Optics, Lens, Diffraction

About Spencer Cox

Spencer Cox is a landscape and nature photographer who has gained international recognition and awards for his photography. He has been displayed in galleries worldwide, including the Smithsonian Museum of Natural History and exhibitions in London, Malta, Siena, and Beijing. To view more of his work, visit his website or follow him on Facebook and 500px. Read more about Spencer here.


Comments

  1. good old PL follower
    April 4, 2016 at 7:10 am

And why not relating this article to hyperfocal length?

    • Spencer Cox
      April 4, 2016 at 12:43 pm

      Good thought. I added a bit about hyperfocal distance under the “Choosing the Sharpest Aperture” section. I’m also planning to write an article that covers the sharpest possible settings for a landscape with a wide depth of field — that will tie these two articles together more closely.

    • Carsten
      April 5, 2016 at 5:45 am

      As Aaron D. Priest already pointed out in his comments to Spencer’s article explaining hyperfocal distance (photographylife.com/lands…-explained) there is nice trilogy of apps available for the iPhone provided by George Douvos which ties this all together: depth of field, diffraction, and focus stacking.

  2. Beyti
    April 4, 2016 at 7:40 am

    Thank you for the wonderful article Spencer. This is directly related to the photography I do. I take New York City panoramic photographs mostly. Here is my panoramic portfolio

    art.8thrulephoto.com/NYC-p…index.html

    I have no problem with single shot images but when I make panorama, I`m always trying to find a way to make it as sharp as I can. Do you think f8-f11 will be enough to get everything sharp if my subject is half a mile away? When I use DOF calculator, it tells me that when I put the focus point on a point half a mile away at f8, anything from 453 feet to infinity is sharp. I was always using smaller aperture like f16 to get everything sharp. I guess based on this calculation I don`t need to use f16. I can use f8 to prevent diffraction and also get all NYC skyline sharp.
    What do you think?

    • Nasim Mansurov
      April 4, 2016 at 8:21 am

If your nearest foreground object you want in focus is far enough to be at infinity focus, you should always use the optimum aperture of your lens. Whether you shoot panoramas or a single shot does not matter – you focus once when doing panoramas anyway. As Spencer pointed out, find what the optimum aperture is (typically in the f/4-f/8 range) for your lens and always try to use it as the default aperture…

      • Beyti Barbaros
        April 4, 2016 at 8:28 am

        Perfect…Thank you Nasim.

        • Chris
          April 12, 2016 at 11:53 am

Beyti, consider though – choosing f/8 will not render those star effects that your night pictures have… but sharpness will definitely improve, especially if you use a high-megapixel camera such as the D800 series. Star effects are rendered when you choose a high f-number (small aperture).
You can take multiple exposures at different apertures and blend those stars in in Photoshop.

          Beautiful portfolio you have there !

      • Kathleen
        June 19, 2017 at 6:55 am

        Helpful article and comment, thanks y’all.

  3. Siddhant Sahu
    April 4, 2016 at 9:04 am

    Wow…never read such a detailed article on “Diffraction”, involving so much physics, nice research Spencer.

    • Spencer Cox
      April 4, 2016 at 12:43 pm

      Thanks, Siddhant!

  4. Merlin Marquardt
    April 4, 2016 at 9:09 am

    Good explanation of a difficult subject.

    • Spencer Cox
      April 4, 2016 at 12:43 pm

      Thank you, Merlin! It was a difficult subject to research, too :)

  5. John D
    April 4, 2016 at 9:16 am

    Thanks for a great article. I understand this phenomenon from doing experiments with a laser, but it’s wonderful to see such a good technical article explain the subject so well.

    • Spencer Cox
      April 4, 2016 at 12:46 pm

      Thank you! I saw a lot of references to lasers as I researched for this article — I haven’t worked with any in person, though.

  6. Arthur
    April 4, 2016 at 9:17 am

    Hey Spencer, nice that you wrote such an extensive article about diffraction.

    However, this image is incorrect: cdn.photographylife.com/wp-co…60×556.jpg

    That pattern that arises on the right is when there are parallel waves directed to it (i.e. a source from very far, or something like a laser). On this image it is correct, and you do see the parallel waves: cdn.photographylife.com/wp-co…60×418.jpg

    • Spencer Cox
      April 4, 2016 at 9:24 am

      Thank you for including this information! I drew it how I did in the article just to make it easier to visualize a wave, but you are absolutely correct; the diagram is most accurate if the incoming wave is straight rather than curved. I will add a note in the article as soon as I get a chance.

    • Pete A
      April 6, 2016 at 1:46 pm

      Arthur, I understand the essence of what you wrote, but being the scientific pedant that I am: Planar waves do not, and cannot, exist in the object space of practical reality.

      Lasers do not, and cannot, produce planar waves because their light output passes through their finitely-sized exit aperture. In order to produce true planar waves — a perfectly collimated beam — the required size of the exit aperture is infinite. Only an infinite diameter circular aperture exhibits zero diffraction thereby enabling it to emit waves having a curvature of zero.

  7. Simon
    April 4, 2016 at 9:20 am

Thanks for this interesting article. I am just wondering about absolute vs relative aperture size: does the diffraction effect depend on the actual size of the hole (expressed in mm) or on the size of the hole expressed relative to the focal length (expressed as f/5.6, f/8)? In other words, do a 50mm and a 100mm involve the same diffraction level at the same f value, or is the diffraction obtained with a 50mm at say f/4 the same as that obtained with a 100mm at f/8, as both involve the same absolute hole size?
    Thanks

    • Spencer Cox
      April 4, 2016 at 9:30 am

      Thanks for bringing this up – it can get confusing. In fact, the “relative” aperture is the only one that matters in terms of diffraction. In other words, you always see the same amount of diffraction blur at f/16, whether you use an 11mm lens or an 800mm lens.

    • Pete A
      April 4, 2016 at 12:16 pm

      Simon, The size of the Airy disk is actually an angle therefore its diameter linearly increases with distance away from the aperture diaphragm and towards the image plane. E.g., we have two lenses both focussed at infinity: f₁=50 mm at f/5 (aperture diameter D₁=10 mm); and f₂=500 mm f/5 (aperture diameter D₂=100 mm). The 10 mm aperture produces an Airy disk that’s ten times the angle of the 100 mm aperture, however, it has to travel only 50 mm to the image plane compared to 500 mm for the longer lens. The diameter of the disk at the image plane will be exactly the same for all lenses set to the same working f-number (Nw), regardless of their focal length.

      Note: Nikon cameras display Nw rather than the actual f-number (N) of the lens, which is especially useful for macro photography where Nw is significantly greater than N: this difference is often termed the “bellows factor”.

      • Nasim Mansurov
        April 4, 2016 at 2:18 pm

        As usual, excellent technical explanation and commentary Pete, thank you for pitching in and helping out!

        • Pete A
          April 4, 2016 at 3:29 pm

          Thank you, Nasim. I really enjoyed reading Spencer’s thoughtfully illustrated article to explain this difficult-to-understand, but essential, topic.

          Absolutely no criticism intended towards anyone: please change all occurrences of the word “airy” to “Airy” in the article to help readers differentiate between “a spacious, well lit, and well ventilated room” and the English Astronomer Royal, Sir George Biddell Airy :-)

          • Spencer Cox
            April 4, 2016 at 3:50 pm

            Ah, thank you Pete – I should have caught that one! I changed the capitalizations.

  8. Ludwig Keck
    April 4, 2016 at 12:40 pm

You have recommended that photographers do their own tests. That is certainly good advice, but it leads to the question “how do I test?” I have used, and recommended, a quick and dirty method that gives a good idea of the “best” aperture: Find a distant object with a lot of detail. I like trees in winter. Set the camera on a tripod. This not only avoids shaking the camera but also makes sure that the camera is not moved for the series of exposures. Focus manually. Set the camera for fixed ISO and aperture preferred. Take exposures at different apertures, letting the camera set the shutter speed.

    The “sharpest” photo will have the most detail. The photo with the most detail will have the largest file size. So you can tell the best aperture setting by just looking for the photo of the largest file size.

    No, this is not the best testing method, it is just quick and easy.

    • Spencer Cox
      April 4, 2016 at 12:51 pm

      Interesting! I didn’t know that method could work; I’ll need to try it out at some point. I feel like it could run into errors as vignetting and aberrations decrease at smaller apertures, although — as you say — it is more about quick approximation than complete accuracy.

      My personal method would be to do exactly what you suggest, then simply study and compare the photos side-by-side at 100% in Lightroom. Of course, that takes some extra time.

      Thanks for your comments!

    • Kathleen
      June 19, 2017 at 6:59 am

      Helpful article and comment, thanks y’all. I shot a brick wall using a tripod 8-10 feet out and shooting at each f stop. At f/22 there was definite diffraction, and my f/2.8 was a bit soft but usable.

      Reply
  9. PALM
    April 4, 2016 at 1:42 pm

    Very nicely done Spencer! I will only mention that not only has NASA learnt how to combat diffraction, but so has modern-day microscopy (e.g. PALM), providing images far below the diffraction limit (Airy disk size) – images of cellular structures at a spatial resolution of just a few nanometers – or allowing tracking of the spatiotemporal behaviour of single molecules in live cells, simply to learn their function…

    Reply
    • Spencer Cox
      April 4, 2016 at 6:53 pm

      Very interesting, thanks! I just read about PALM on Wikipedia — it looks like an incredibly useful technique for microscopy. I guess it doesn’t work for standard photography, though :)

      Reply
      • PALM
        April 4, 2016 at 8:43 pm

        It does not. PALM will only work if you have total control over the thing you are imaging: you literally tell the thing when to light up in a sequential manner, molecule by molecule, one molecule at a time. When you have finally imaged all the molecules, you eliminate diffraction in each image (for each molecule) and reconstruct your superresolution image; but you sacrifice temporal resolution, of course.

        Reply
        • Spencer Cox
          April 7, 2016 at 7:58 pm

          That’s incredibly interesting, thanks for sharing!

          Reply
  10. Jan Stuck
    April 4, 2016 at 4:34 pm

    Great article that was missing from this wonderful website for a long time. The explanation along with the photos was great and helps a lot! Maybe we can soon have an article about focus stacking (specifically for landscape photography, as it is already an obvious topic for macro photography). That can help folks who want to avoid diffraction but still have the opportunity of getting immense depth of field.

    Thank you for all the time you put into the website, along with all the other colleagues from Photography Life.

    Reply
    • Spencer Cox
      April 4, 2016 at 6:49 pm

      Thank you, Jan! I agree, a focus stacking article for landscapes would be useful — thanks for the suggestion.

      Reply
  11. Eric Bowles
    April 4, 2016 at 5:03 pm

    Good article, Spencer.

    One thing readers should keep in mind is that this article treats diffraction as just one dimension of sharp image making. There are other factors besides diffraction that impact sharpness. For example, some lenses are sharpest near wide open while others are better at around f/8. Most lenses deteriorate in sharpness as you stop down further – to f/11 or beyond. In addition, sharpness varies across the frame, with the center normally being the sharpest area, but corners can vary quite a bit depending on aperture and lens choice.

    Distance can also have an impact on sharpness. Most Nikon lenses are designed for moderate distances – not small subjects far away. When you get beyond 100-150 feet away, there is normally going to be some softness.

    AF accuracy may also impact testing. If you are trying to test your lens and camera, you need bright light and high contrast not low light conditions. Low light makes it harder to have enough light for good AF, and it also decreases apparent sharpness by reducing contrast.

    Judging diffraction solely by looking at sharpness is not necessarily the right approach. You’ll need to keep in mind the performance of a given lens. Some lenses are sharpest wide open or at wide apertures and deteriorate based on lens design even when diffraction is minimal. Still – personal testing is useful because it helps to understand how your gear works and where you consider the problems to be significant.

    Reply
    • Pete A
      April 4, 2016 at 5:58 pm

      Eric Bowles wrote: “Most Nikon lenses are designed for moderate distances – not small subjects far away. When you get beyond 100-150 feet away, there is normally going to be some softness.”

      Citation needed!

      Reply
      • Eric Bowles
        April 4, 2016 at 6:37 pm

        Nikon certainly won’t admit it, but lots of lenses are like that. The Nikon 200-400 is one of the best known: very sharp inside 150 feet but softer at long distances. The 600 f/4 is great at reasonable distances, but not at extreme distances. Even testing is done at closer distances – not on distant subjects.

        Nikon does a nice job of making decisions about how to balance different factors to get the best performance in the intended use. But most photographers who demand sharp images are not photographing dots in the distance – they are filling the frame or deciding how to get closer.

        Reply
        • Pete A
          April 5, 2016 at 10:24 am

          Yes, some lenses don’t perform their best at infinity focus. In the case of the Nikon AF-S NIKKOR 200-500mm f/5.6E ED VR it seems to be a quality control issue rather than a deliberate design criterion:
          photographylife.com/revie…m-f5-6e-vr

          The main things that limit far-distance image sharpness from 400+ mm FX lenses are: atmospheric refractive index gradients, such as heat shimmer; and atmospheric haze [suspended particles, including water vapour].

          From the above, it should be very obvious why lens tests conducted at a focus distance of hundreds or thousands of metres are completely meaningless. However, this does not mean that Nikon has a design policy that provides optimal performance at medium focus distances. I am therefore not surprised by the fact that, instead of backing your claims with citations, you wrote: “Nikon certainly won’t admit it, but…”

          I again apply Hitchens’s razor to your claims: “What can be asserted without evidence can be dismissed without evidence.”

          Reply
          • Samuel Flores Sanchez
            October 17, 2017 at 1:23 pm

            I think that I’m out of my depth here because I’m not a scientist, but I think that movement can be a cause of blurriness at long distances too. Little vibrations in the camera, even from the shutter curtains, can be undetectable at short focus distances but definitely detectable at long distances.

            Reply
    • Nasim Mansurov
      April 4, 2016 at 6:13 pm

      Eric, just like Pete, I would seriously question the statement of Nikon lenses being designed for moderate distances. A lot of Nikkor lenses perform extremely well at infinity…

      Reply
  12. Aaron D. Priest
    April 4, 2016 at 6:37 pm

    Excellent article Spencer, you did a lot of research on this. Others have already mentioned a couple minor corrections that I noticed (Airy for example), so I’ll just give you a big thumbs up! :-)

    I have to deal with this a lot with spherical panoramas when there are objects from about a foot to several feet away. It’s always a balance between shooting more focus stacks or smaller apertures and trying to get optimum sharpness. It doesn’t help that the no-parallax point shifts when changing aperture and focus, so that too must be taken into account when finding the best compromise.

    Reply
    • Spencer Cox
      April 4, 2016 at 6:47 pm

      Thank you, Aaron! I can imagine how complex this topic (and hyperfocal distance) can get when you are making spherical panoramas. I didn’t know that the no-parallax point shifted at different aperture values — is that a result of focus shift?

      Reply
      • Pete A
        April 5, 2016 at 9:31 am

        Spencer, The perspective of a scene is determined entirely by the position and alignment of the lens entrance pupil within the 3D object space of the scene. The pivotal point for zero parallax error is therefore the central point of the lens entrance pupil.

        When using a traditional symmetrical lens that has a constant thin-lens model direct equivalent, the location of its entrance pupil relative to the front of the lens remains constant throughout its range of focus. All such lenses focus by altering their distance from their mounting flange. In other words, if we keep the entrance pupil in a fixed position then a closer focus distance requires that the camera and its image plane are moved further away from the lens. The focus shift with f-stop that’s caused by spherical aberration in the virtual entrance pupil is in the region of a millimetre or three in object space therefore the parallax error it causes is insignificant.

        Modern high-performance zooms, prime macro lenses, and prime telephoto lenses, do not have a constant thin-lens model equivalent. All lenses that use an internal focus mechanism — e.g., all Nikkor IF-ED lenses — vary their thin-lens model equivalent parameters throughout their focus range, especially the pivotal point of their entrance pupil. They also change their focal length, but this factor alone affects only the expected scene magnification ratio, it does not affect the perspective of the scene.

        When capturing panoramas in which the nearest object is hundreds of metres away, the distance between the camera tripod socket and the lens entrance pupil causes very little parallax error. Conversely, Aaron’s spherical panorama photography requires precise positioning of the lens entrance pupil in 3D object space.

        Reply
        • Nasim Mansurov
          April 5, 2016 at 10:17 am

          Pete, this warrants an article :) If you could put the above in an article, I would be happy to post it on PL. If you don’t have any images, please let me know what would be ideal to use and I will do my best!

          Reply
        • Aaron D. Priest
          April 5, 2016 at 10:20 am

          ^ Yeah… that… LOL! I’m not sure of all the physics behind it, but that sounds legit to me, Pete! Haha! I really only have this issue with very close objects in a confined space, such as an airplane cockpit, when focus stacking or shooting in 3D. For general landscapes at 6 feet and further away, the small amount of shift in the no-parallax point due to re-focusing is less of an issue; as long as you are in the ballpark, the stitching software handles the rest.

          Reply
        • Spencer Cox
          April 7, 2016 at 7:57 pm

          Thank you for clarifying, Pete — very interesting information! I (and many other PL readers) enjoy learning the fundamental reasons why these effects occur. It’s valuable to know the “why” along with the “what” for such complex topics.

          Reply
          • Pete A
            April 9, 2016 at 7:36 am

            Nasim and Spencer,

            I’ve spent the last three days preparing to start writing an article on the subject of perspective, lens entrance pupils, and how they apply to stitched panoramas.

            Thus far, I’ve managed to design a hypothetical rectilinear wide-angle retrofocus prime lens to provide the technical underpinnings for the article, and have produced a detailed diagram of it, based on the cardinal points of Gaussian optics.

            During this process I’ve learnt two interesting things: how to use the drawing application on my computer [that I didn’t know I’d installed!]; and the sudden realization that it is impossible to explain the role of the lens entrance pupil without firstly explaining the fundamental nature of light and human vision, which are two of the most important core principles in optical physics of which the vast majority of people are totally unaware.

            So, I’m wondering if the PL team would consider compiling a tutorial that addresses these core principles because it would serve as a rock solid basis [a reference point] for both answering readers’ questions and for underpinning the fascinating technical articles, such as Spencer’s What Is Diffraction in Photography? The answer to that question is: The lens entrance pupil, which is a virtual object that produces real images, therefore it is responsible for the diffraction; this virtual object represents an extremely important part of the highly complex interface between the domains of object space and image space. But, without me firstly having adequately described these very different domains, thereby putting my statement into its scientifically correct context, my simple answer sounds like complete gobbledegook — often misinterpreted as “mansplaining”, et cetera.

            I’m not able to write a suitable tutorial on PL for several reasons, but I’d be more than willing to provide suggestions and technical input to any/all of the team who would like to undertake this project.

            Kindest regards,
            Pete

            Reply
            • Merlin Marquardt
              April 9, 2016 at 9:51 am

              As far as I know/understand, diffraction is a physical, not a physiological, phenomenon.

              Reply
              • Pete A
                April 9, 2016 at 3:58 pm

                It’s both — in very bright light, the pupils in our eyes ‘stop down’ to an aperture small enough to noticeably restrict their resolution due to the increased level of diffraction.

                Reply
                • Merlin Marquardt
                  April 9, 2016 at 7:12 pm

                  Perhaps, but in any case the diffraction is a physical phenomenon.

                  Reply
                  • Pete A
                    April 10, 2016 at 3:14 am

                    Is there a physiological phenomenon that is not a physical phenomenon?

                    Reply
                    • Reasonable
                      April 10, 2016 at 3:50 am

                      Yes, for example some people perceive colors differently but it does not mean that colors are different.

                    • Pete A
                      April 10, 2016 at 7:54 am

                      Reasonable, Thanks for your reply. I’m wondering if various branches of science use the terms differently. My understanding is this: objective physical reality ⇒ our physiological sensory system ⇒ our psychological subjective interpretation, which results in our perception of physical reality.

                      “Perception (from the Latin perceptio, percipio) is the organization, identification, and interpretation of sensory information in order to represent and understand the environment.[1] All perception involves signals in the nervous system, which in turn result from physical or chemical stimulation of the sense organs.[2] For example, vision involves light striking the retina of the eye, smell is mediated by odor molecules, and hearing involves pressure waves. Perception is not the passive receipt of these signals, but is shaped by learning, memory, expectation, and attention.[3][4]

                      Perception can be split into two processes.[4] Firstly, processing sensory input, which transforms these low-level information to higher-level information (e.g., extracts shapes for object recognition). Secondly, processing which is connected with a person’s concepts and expectations (knowledge) and selective mechanisms (attention) that influence perception.”
                      en.wikipedia.org/wiki/Perception

                      A dream is an example of visual perception of things that do not exist in objective physical reality, however, dreams are the result of real physical activity in the physiological brain. Optical illusions are wonderful demonstrations of our inability to accurately perceive objective physical reality.

                      The physiological diffraction that occurs in our eyes reduces their angular resolution therefore we see the effects of it directly, in other words, our perception of it = its objective reality. The diffraction that occurs in photography is the same objective physical phenomenon, but we can see it only indirectly as a level of fuzziness that increases the closer we look at the image, therefore our perception of it does not equal its objective reality.

                      Depth of field is an example of something that consists entirely of physiological plus psychological perception because it doesn’t exist in objective physical reality: only the plane of focus is in focus; every object at a different distance is, by definition, out of focus. The circle of confusion is a physical phenomenon; depth of field is a subjective perceptual phenomenon.

                      It is very important to avoid making category mistakes in technical discussions on photography. Category mistakes, aka category errors, are ontological errors that I usually refer to by the term “domain errors”. E.g., diffraction and circle of confusion belong to the domain of objective physical reality, whereas their visible effects belong to the domain of each person’s visual perception. Words that end in “ness” indicate that they are in the latter domain. Here are some domain mappings from the physical ⇒ the perceptual, note that these are definitely not bi-directional mappings:

                      circle of confusion ⇒ fuzziness that varies with object space z distance
                      diffraction ⇒ overall fuzziness
                      far out-of-focus object rendition ⇒ bokeh
                      illuminance ⇒ brightness
                      quantization ⇒ smoothness; coarseness
                      signal-to-noise ratio ⇒ graininess; noisiness
                      system optical transfer function ⇒ sharpness

                    • Reasonable
                      April 10, 2016 at 10:31 am

                      Not sure what you did not understand from the sentence I wrote, but the thing is simple! There are people who see green where others see brown, even though they are looking at the same color (confirmed by any spectrophotometer in the lab, for example). So this is a physiological phenomenon, not a physical phenomenon, because it arises from the different build of their visual system, not because they are looking at different colors. The same applies to diffraction and other physical properties of light and matter. You know that human perception is subjective, right? So there will always be physiological variation in how we perceive physical properties, which themselves stay the same. This is why we use standardized detectors in the lab and do not rely on human detectors anymore, as in past centuries ;) Let me quote Einstein here:

                      “Science should be made as simple as possible but not simpler”

                    • Merlin Marquardt
                      April 10, 2016 at 12:43 pm

                      Well, I would agree all physiological phenomena have physical phenomenological bases, but diffraction is a physical phenomenon only. Diffraction due to an extremely small pupil is still just a physical phenomenon. The constriction of the pupil is a physiological phenomenon, and the perception of the effects of diffraction may be physiological, but the diffraction itself is a physical phenomenon.

                    • Pete A
                      April 11, 2016 at 3:29 am

                      Merlin, the optical functions of an entrance pupil and a variable magnification lens, and the resulting diffraction, are all physical phenomena. Eyes are a physiological implementation of an imaging system; cameras are a technological implementation of the same system. Neither system causes diffraction per se; diffraction is just waves doing what waves do when they encounter an obstacle – a circular aperture in this case. Likewise for the refraction that waves undergo when there is a change in refractive index, and for the photon energy that is transferred to the sensor.

                      Whether the aperture and magnification of the lens are changed by muscles (physiologically) or by servo motors (technologically) is irrelevant, because they are just essential mechanical components in the complex servo systems that provide the two functions of auto-exposure and autofocus. In a manual camera system these functions are provided physiologically by its operator, so one could argue that a manual imaging system is a physiological-only imaging system because it cannot function using the natural laws of physics alone. Machines that are designed to extend the functionality of their human operators need an ergonomically designed control layout in order to provide the most seamless, intuitive interface between human and machine. Having a camera with a superb imaging system but a confusing menu system and/or control layout is a gripe that has been mentioned several times in articles on this website.

                      I don’t think making a distinction between physical and physiological phenomena is useful, however, making a clear distinction between objective physical phenomena and their resulting psychological subjective effects is essential to both art and photography. As I’ve mentioned before, a small photo viewed from afar will have an infinite depth of field and zero perceivable diffraction; the same image when pixel-peeped will have a shallow depth of field and/or clearly visible diffraction. Obviously, the resulting subjective effects of the underlying objective physical phenomena depend entirely on the angle of view that the image presents to the viewer and the visual acuity of the viewer. Photographers need to take into account the viewing conditions of their intended audience before pressing the shutter button. Following a rule, such as f/16 will produce easily noticeable diffraction on a 30+ MP full-frame sensor, is useful only if their intended audience consists entirely of pixel-peepers.

                    • Reasonable
                      April 11, 2016 at 12:26 pm

                      I don’t mean to offend you Pete A, but gosh, I hope you don’t teach students. Not because you don’t have necessary knowledge, which I am sure you have, but man this is not going to engage young people…

  13. Thorben Doehl
    April 5, 2016 at 8:38 am

    Excellent article, thanks a lot! Maybe you could add some words about the influence of print size and viewing distance; I guess many people care about diffraction but are using screens, TVs, etc. to display their pictures.

    Reply
    • Spencer Cox
      April 7, 2016 at 7:49 pm

      Thank you for the comments! Very true — if you are using a TV or small prints to display your photos, diffraction becomes much less relevant. It still exists, of course, but it is harder to notice at any reasonable aperture settings.

      For large prints, though, people tend to enjoy examining them for every possible detail. Viewing distance certainly matters, but only if you can be sure that no one will view the print from a closer vantage point (for instance, a billboard ad).

      Reply
  14. Merlin Marquardt
    April 11, 2016 at 8:02 am

    Agree, more or less.

    Reply
  15. Jon Middleton
    March 5, 2017 at 12:25 pm

    Is diffraction really only a function of relative aperture, i.e., f-stop, or a function of the absolute size of the aperture? It seems that diffraction is caused by the bending of light around the edges of the aperture, and increases as the aperture size is reduced. This explanation would seem to be supported by the fact that with smaller sensors, diffraction seems to become more problematic at larger apertures. On my view camera, my 210mm lens doesn’t show much diffraction at f/11, or even f/22 and beyond. At f/11 on my Sony RX100MV, it’s pronounced at 24mm (35mm equivalent, actually 8.8mm). In other words, it seems like diffraction should be a function of the absolute aperture size in mm and the wavelengths of light, which are constant.

    Reply
    • Spencer Cox
      March 5, 2017 at 11:41 pm

      Your interpretation is incorrect. The amount of diffraction that you see in your photos is a function of the f-stop (the focal length divided by the aperture diameter), not the physical diameter of the aperture blades in your lens. That’s why you’ll see the same amount of pixel-level diffraction at a given aperture whether you use a 14mm lens or a 500mm lens, assuming the same sensor size.

      So, why aren’t you seeing much diffraction with your view camera, even at apertures like f/22? Look at the first figure under Section 4, “Small Versus Large Sensors” to see the reason. Essentially, at a given output size, prints from small sensors will show diffraction much more readily than from large sensors.

      Hope that helps!
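      Spencer’s point can be checked against the standard Airy disk formula, d ≈ 2.44 · λ · N, where N is the f-number and λ the wavelength: the focal length and physical aperture diameter cancel out, leaving only the f-stop. A quick sketch, with 550 nm green light assumed:

      ```python
      def airy_disk_diameter_um(f_number, wavelength_nm=550):
          """Diameter of the Airy disk (to the first dark ring), in microns.

          d = 2.44 * lambda * N -- it depends only on the f-number, not on
          the focal length or the physical size of the aperture opening.
          """
          return 2.44 * (wavelength_nm / 1000.0) * f_number

      # The disk grows in direct proportion to the f-number:
      for n in (2.8, 8, 16, 22):
          print(f"f/{n}: {airy_disk_diameter_um(n):.1f} um")
      ```

      At f/16 the disk is already several times larger than the pixels of a high-resolution sensor, which is why stopping down eventually blurs fine detail regardless of the lens.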

      Reply
      • Jon Middleton
        March 6, 2017 at 8:10 am

        Thanks, Spencer, that makes sense. So, is there a table somewhere that shows at what aperture diffraction becomes a problem for different sensor sizes at reasonable output sizes, or maybe enlargement ratios? That’s what’s really important; there should be a graph somewhere by now.

        Reply
      • Jon Middleton
        March 6, 2017 at 10:34 am

        So, looking at lens performance for a minute. I was reading about my Nikon 90mm f/8 SW large format lens, and peak performance appears to be at f/16 in this test.

        www.arnecroell.com/lenstests.pdf

        But the peak appears at f/11 in this one:

        www.hevanet.com/cpere…sting.html

        I don’t know enough about the methodologies to know which is correct. Diffraction would favor f/11.

        Reply
        • Spencer Cox
          March 7, 2017 at 7:50 pm

          There isn’t a table to show the onset of problematic diffraction, since it’s a very subjective thing. Technically, I start to see some minor effects of diffraction even at apertures like f/5.6 on my Nikon D800. I have no problem using an aperture of f/16, but other photographers will draw the line at f/11, f/22, or something else entirely. (These numbers are for using an FX sensor size.)

          As for the two testing methodologies, lenses overall vary quite a bit from copy to copy, which could explain the result. There is no perfect lens test, either, and that could be at fault as well. However, in practice, the difference between f/16 and f/11 is likely to be quite small, and not something you will easily notice in the real world. Diffraction does favor the f/11 shot, but the f/16 shot will decrease unsharpness problems that the lens itself has, so it’s a tradeoff. With a large format camera and a 90mm lens, something like f/11 will have a very thin depth of field, so you may be better off sticking to f/16 for certain images (or even something smaller like f/22 or f/32). However, it is best if you test this for yourself.

          Reply
  16. Jon Middleton
    March 7, 2017 at 9:39 pm

    I actually found a diffraction calculator, which was very interesting, here:

    www.cambridgeincolour.com/tutor…graphy.htm

    It appears to be complicated, as seeing diffraction depends on output size, viewing distance, pixel size, etc. Interesting subject; I’ll be doing some testing of my gear. The largest JPEG files from my RX10MV appear to be at f/5.6, but I need to do some more rigorous testing. I have several 20″ x 24″ prints from my Zone VI with a Rodenstock Sironar N 210mm, which isn’t supposed to be a great lens, but they are very sharp corner to corner. I think they were shot at between f/22 and f/32, so go figure.

    Reply
    • Jon Middleton
      March 7, 2017 at 9:41 pm

      RX100MV, not 10.

      Reply
  17. Doug
    March 25, 2017 at 9:39 am

    Well, I was just on Flickr, and a pro landscape photographer was consistently using f-stops higher than f/11. The images looked very sharp. I was looking for deterioration of the image due to diffraction, but to my eyes they looked pretty good. I would imagine focus stacking could give similar effects. The images made me wonder just how much images actually suffer at f/16 and f/22. They looked good to me. They were mostly taken at f/16.

    Reply
    • Aaron D. Priest
      March 25, 2017 at 9:57 am

      It greatly depends on pixel density and the size of your prints too. A Nikon D810 or Sony a7R II at 36 or 42 megapixels is going to show diffraction sooner than a 12-megapixel Nikon D700 or Sony a7S. And viewing an image online at 1024 to 2048 pixels wide doesn’t show the detail that a 20 x 30 print on the wall would show either. So, as with anything in photography, it’s relative. :-P I don’t go further than f/13 on my D810 when shooting gigapans, because it has a high pixel density and is going to be printed large (several feet quite often). f/16 actually loses considerable sharpness under those conditions.
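      Aaron’s pixel-density point can be made concrete with a rough rule of thumb: diffraction starts to become visible at the pixel level once the Airy disk (≈ 2.44 · λ · N) spans about two pixel widths. The two-pixel threshold is an assumption for illustration, not a hard rule, and the sensor widths and pixel counts below are nominal full-frame figures:

      ```python
      def pixel_pitch_um(sensor_width_mm, pixels_wide):
          """Pixel pitch in microns, from sensor width and horizontal resolution."""
          return sensor_width_mm * 1000.0 / pixels_wide

      def diffraction_onset_f_number(pitch_um, wavelength_nm=550, disk_in_pixels=2.0):
          """f-number at which the Airy disk (2.44 * lambda * N) spans
          `disk_in_pixels` pixel widths -- a rough visibility heuristic,
          not a hard limit."""
          return disk_in_pixels * pitch_um / (2.44 * wavelength_nm / 1000.0)

      # Nominal full-frame bodies, 36 mm sensor width:
      for name, px in (("D810, 36 MP", 7360), ("D700, 12 MP", 4256)):
          pitch = pixel_pitch_um(36.0, px)
          onset = diffraction_onset_f_number(pitch)
          print(f"{name}: {pitch:.1f} um pitch, onset around f/{onset:.0f}")
      ```

      The denser sensor reaches the threshold at a wider aperture, which is consistent with Aaron’s experience of the D810 showing diffraction sooner than the D700.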

      Reply
  18. David Powell
    April 20, 2017 at 2:56 am

    Thank you for an excellent article, pitched at just about the perfect level for me and, I suspect, for a lot of readers. I briefly encountered the mathematics of optics at college, many years ago, and have no desire to revisit! I particularly appreciated your advice on coming to terms with the problem and not overreacting. The illustrations were particularly helpful in getting a feel for the scope of the problem. I’m now informed, and calm too…
    One question: I’m just taking up photography again after a long interlude. Back in the seventies and eighties I used to read photographic magazines quite regularly, but I don’t recall any discussions of diffraction. It was always small aperture = depth of field and/or getting the best out of budget lenses. Obviously diffraction would have been occurring then, exactly as it does today, and of course lens designers would have been aware of the possibility, but there didn’t seem to be any public concern. It’s entirely possible that my reading material was at too low a level, or that I missed the shocking revelation, but I am left wondering if diffraction might have less impact on film than on digital sensors for some reason. Any thoughts?

    Reply
    • Spencer Cox
      April 20, 2017 at 3:19 am

      Glad you enjoyed the article. In general, diffraction is associated less with film photography simply because, by area, film doesn’t have the same resolving power that digital camera sensors do. In terms of diffraction, the difference between f/8 and f/16 was — on many films — mostly obscured by the grain. Even today, shooting at f/22 with a high-resolution digital body, I can’t see any differences at all when the image is well-sharpened and printed at 8×12 inches or smaller. You need a camera with very good resolving power and a relatively large print (or crop) to see diffraction at all but the most insane apertures.

      People still knew about diffraction in the 70s and 80s, and I’m sure that some photographers avoided extremely small apertures for that very reason, but it simply wasn’t as obvious as it is on today’s cameras.

      Reply
  19. Jon Middleton
    April 20, 2017 at 9:09 am

    I guess one makes a choice between softness due to diffraction and increased depth of field. Also, enlargement ratios come into play. Maybe I’ll shoot a series of photos with my view camera at, say, f/11, f/16 and f/22 to see which is best.

    Reply
  20. Samuel Flores Sanchez
    October 17, 2017 at 1:06 pm

    Hi.
    What do you mean by “If you want to test deconvolution sharpening, increase the ‘Detail’ slider as much as possible in either Lightroom or Camera Raw”? How does that work?
    Thanks!

    Reply
    • Spencer Cox
      October 22, 2017 at 2:45 pm

      Adobe changes around their sharpening algorithm depending upon your “Detail” setting. So, with detail +100, the algorithm is as close as possible to deconvolution sharpening, which tends to be ideal for fixing areas with diffraction. Personally, my go-to sharpening settings are +30 sharpening, 0.5 radius, 100 detail, and 10 masking (although this may change from photo to photo).

      Reply
  21. Polaris
    January 24, 2018 at 6:32 am

    Thanks for this comprehensive article and the nice photos, especially the last one…
    Why don’t lens manufacturers add a mark on the lens to indicate the best aperture, the one for which the lens is optimized?

    Reply
    • Spencer Cox
      January 26, 2018 at 3:12 pm

      Although they could do something like that, the simple answer is that the “best aperture” can change. Sometimes, a lens is sharpest in the center at f/4, but sharpest in the corners at f/5.6. Other times, a lens’s optimal aperture will change as you zoom in (or even focus more closely versus farther away). There’s also the danger that people would see the “optimal” value and try to use it for every photo, when in reality sharpness is a much smaller consideration than other aspects like depth of field and exposure. Also, it keeps sites like Photography Life in business to be able to do lens testing and figure out this stuff for ourselves :)

      Reply
  22. Ellie Navarro
    February 7, 2018 at 3:53 pm

    This is an excellent article. Thanks for not dumbing it down too much for us. :) I tested today and am thrilled to have the info. My Asheville Mtns are going to look so much better!

    Reply
  23. Tom McIntyre
    February 22, 2018 at 6:43 am

    I have taken landscape photos for years that had great depth of field but lacked tack-sharp details. Now I know why! f/22 is not always best! Going to have fun making comparison shots with my particular equipment to see which apertures work best for me. I am learning that everything in photography, like in life, seems to come down to making the best compromises. Thank you for a great article!

    Reply
  24. John
    November 22, 2018 at 3:01 am

    Great article!
    I think there is a typo in this sentence (section 6):
    “However, at wide apertures like f/2.8 or f/4, the Airy disk is much smaller than the pixels in your photograph. This means that diffraction is essentially impossible to see at such small apertures.”
    I think the end should be “at such large apertures”?

    Reply
    • Spencer Cox
      December 24, 2018 at 2:15 am

      Great spot, John, thank you! I’m surprised that it slipped by for so long, but very glad you caught it.

      Reply
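The sentence John corrected is easy to check with numbers. The Airy disk’s diameter out to its first dark ring is roughly 2.44 · λ · N (wavelength times f-number), a standard diffraction result. A quick back-of-the-envelope comparison, assuming green light (550 nm) and a ~5.9 µm pixel pitch (typical of a 24 MP full-frame sensor; both values are illustrative assumptions):

```python
# Airy disk first-minimum diameter: d ≈ 2.44 * wavelength * f_number
WAVELENGTH_UM = 0.550   # green light, in micrometers (assumed)
PIXEL_PITCH_UM = 5.9    # assumed pitch of a ~24 MP full-frame sensor

for f_number in (2.8, 4, 8, 16, 22):
    airy_um = 2.44 * WAVELENGTH_UM * f_number
    note = "larger than a pixel" if airy_um > PIXEL_PITCH_UM else "smaller than a pixel"
    print(f"f/{f_number}: Airy disk ≈ {airy_um:.1f} µm ({note})")
```

At f/2.8 and f/4 the disk stays smaller than this pixel pitch, which is why diffraction is essentially invisible at large apertures, while by f/16 it spans several pixels.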
  25. Barry Scully
    January 17, 2019 at 7:14 pm

    Do you have any links for the equations and theory for how diffraction affects resolution? I have the basics (three years of optics in university) but am interested in learning more about multi-lens optical physics.

    Thanks.

    Reply

Comment Policy: Although our team at Photography Life encourages all readers to actively participate in discussions, we reserve the right to delete or modify any content that does not comply with our Code of Conduct or does not meet the high editorial standards of the published material.


Copyright © 2019 Photography Life