Full frame & diffraction

I'm not, I'm describing Fourier optics through self-diffraction at apertures. The lens defines the wavefront as curved, creating the focus. The diffraction then occurs through the self-interference of the wavefront as it propagates to the focus. This image-formation theory explains why the rays from the periphery increase the resolution of the system. The edges are not, repeat, not the source of diffraction. The spatial extent of the wavefront is. This is not me making stuff up; this is part of how I design, build and operate adaptive-optics-compensated super-resolution microscopes. Go back and read my earlier posts, you might learn something.

Fourier optics describes things very differently to classical optics, and in a way not exactly understandable to a layman without the necessary mathematics and concepts.
The classical view is used in most explanations to do with telescopes and camera optics.
I am in no position to judge why both seem to give adequate answers in their own fields.
Sixty-odd years ago, light was seen as a wave in the aether. Today we have some understanding of the behaviour of photons. The various theories explain why and how they move as they do. I rather doubt our understanding is complete.

It would seem that working with microscope optics involves a whole different level, in terms of the various forms of diffraction and no doubt extremes of lens resolution too. And probably quantum optics alongside Fourier.
 
Fourier optics is a way to describe how wavefronts move through various slits, gaps, apertures, lenses or mirrors of various kinds.
As such, what you are saying is correct in describing what happens during those transformations.

However, from a practical or photographic point of view, we are not so much concerned with the mathematical theory describing the paths of the light waves as they approach and pass through the gap formed by an aperture, or their transformations as they pass through the lens system.
We are more concerned by the fact that the aperture we use defines the limits of the shape of that gap, which I define as the edges of the aperture.
A gap undefined by edges is no gap at all.

The shape and size of the aperture we use provides many useful photographic functions, including depth of field, bokeh and changes to the values of various aberrations. It also defines the extent of the effect of diffraction.

I am happy that you are able to calculate what transpires when a wave front passes through that gap. But I would assert that the limits of that gap (edges), inform the calculations you make, as do any insertions of any nature into that gap.
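The Fourier-optics view being argued over here can be illustrated numerically: in the far field, the diffraction pattern is (up to scaling) the Fourier transform of the aperture function, so the size and shape of the gap directly set the pattern. A minimal numpy sketch for a 1-D slit, with the sample counts chosen purely for illustration:

```python
import numpy as np

# 1-D aperture: a slit of `slit` samples in a field of N samples.
N = 4096
slit = 64
aperture = np.zeros(N)
aperture[N // 2 - slit // 2 : N // 2 + slit // 2] = 1.0

# Fraunhofer (far-field) pattern: intensity is |FFT(aperture)|^2,
# which for a slit is the familiar sinc^2 profile.
field = np.fft.fftshift(np.fft.fft(aperture))
intensity = np.abs(field) ** 2
intensity /= intensity.max()

# The central maximum sits in the middle; the first zeros fall at
# spatial frequencies of +/- 1/(slit width), i.e. N/slit bins away,
# so a narrower slit spreads the pattern wider.
centre = N // 2
first_zero = centre + N // slit
print(intensity[centre])       # 1.0 (peak)
print(intensity[first_zero])   # ~0 (first minimum of the sinc^2)
```

Both sides of the argument show up here: the wavefront as a whole produces the pattern (the transform acts on the entire aperture function), but the width of the gap sets its scale.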
 
He/she said that, didn't they? They said the aperture was important because it defined the limits on the angles of the rays that pass through it. So a small aperture will physically block rays coming in at narrow angles. Obviously the limits of the gap are therefore important for any calculation.
I think what your antagonist was saying, though, was that the lens is itself creating a curved wavefront (it has to) which will obviously then produce self-interference and diffraction. And this is a bigger contribution to diffraction than the "slit effect". You were arguing that the lens had nothing to do with it whatsoever. That's how I understood it. Could well be wrong. Optics isn't my strong point.

Fascinating discussion by the way.
 

Yes, I did say that it was not caused by the lens. A pinhole suffers from diffraction in the same way as a diffraction grating does.
Both have edges that define the gap.
However, a lens would effect changes to the path of that diffracted light, and, it follows, to the Airy disc.
A lens does not cause diffraction; it is inherent to every wavefront.

As photographers the only variable we can control is the aperture.
We need not consider the lens construction as that is outside our control.
Though if we were to place an additional aperture in front of the lens it would change the dynamics but not the inevitability of diffraction.

Historically, the major difference between a Leitz Elmar and a Zeiss Tessar was the position of the diaphragm between the elements.
 
I'm contemplating moving from m4/3 to a FF system for a large number of reasons including the ability to make larger prints.
One of the issues with m4/3 is that diffraction starts to be noticeable at f11 or even wider if you go looking for it.
At what point does it become an issue with a 24mpx FF sensor? Or a 36mpx sensor?
Does removing the anti-aliasing filter help?
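A rough back-of-envelope answer to this question can be worked out from the Airy disc: a common rule of thumb (not a hard limit) says diffraction starts to become visible when the Airy disc diameter, 2.44 x wavelength x f-number, exceeds about two pixel pitches. A sketch, assuming 550 nm green light and a 36x24 mm sensor; the two-pixel criterion is itself an assumption:

```python
import math

WAVELENGTH_NM = 550.0  # mid-green light (assumed)

def pixel_pitch_um(sensor_w_mm: float, sensor_h_mm: float, mpix: float) -> float:
    """Approximate pixel pitch in microns for a given sensor size and megapixel count."""
    pixels_per_mm = math.sqrt(mpix * 1e6 / (sensor_w_mm * sensor_h_mm))
    return 1000.0 / pixels_per_mm

def diffraction_visible_fnumber(pitch_um: float) -> float:
    """f-number at which the Airy disc diameter (2.44 * lambda * N) spans ~2 pixels."""
    return 2.0 * pitch_um * 1000.0 / (2.44 * WAVELENGTH_NM)

for label, mpix in (("24 MP FF", 24), ("36 MP FF", 36)):
    pitch = pixel_pitch_um(36.0, 24.0, mpix)
    print(f"{label}: pitch ~{pitch:.1f} um, "
          f"diffraction visible from around f/{diffraction_visible_fnumber(pitch):.0f}")
```

On these assumptions a 24 MP full-frame sensor (pitch about 6 um) crosses the threshold near f/9, and a 36 MP one (pitch about 4.9 um) slightly earlier, around f/7; as later posts point out, other factors usually dominate well before this matters.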

This question raises a number of complex issues, but many replies seem to have rather missed the point!

FF will give you sharper images than a smaller format because the lens doesn't have to work so hard on resolution and can therefore deliver greater image contrast (basic lens MTF [Modulation Transfer Function] theory). This is the main reason why larger formats are sharper and pixel count doesn't have that much to do with it (within reason) and neither does diffraction at low and mid-range f/numbers. If a certain level of detail requires 30-lines-per-mm resolution on FF, on M4/3 the lens has to work at 60-lpmm (the difference is the crop factor, 2x30=60) and that's a big ask.

Diffraction affects higher resolution first, therefore smaller formats are hit sooner and harder. In practical terms, if f/11 delivers an acceptable standard on FF, then you'll be looking at around f/5.6 on M4/3 (though it's not actually that simple). Diffraction is an optical characteristic and unrelated to pixel count. Those on-line calculators that relate diffraction to f/number and pixel density are a) only a theory, and b) at least 50% wrong as they only consider resolution and ignore image contrast that actually has a more significant impact on our impression of sharpness.
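The crop-factor arithmetic in the two paragraphs above can be made explicit. A minimal sketch, taking the standard 2x crop factor for Micro Four Thirds as given:

```python
# The smaller format must resolve crop_factor times more lp/mm to show
# the same detail, and reaches the same diffraction blur (relative to
# the frame) at an f-number crop_factor times lower.
def equivalent_lpmm(ff_lpmm: float, crop_factor: float) -> float:
    return ff_lpmm * crop_factor

def equivalent_fnumber(ff_fnumber: float, crop_factor: float) -> float:
    return ff_fnumber / crop_factor

M43_CROP = 2.0
print(equivalent_lpmm(30, M43_CROP))     # 60.0 lp/mm on M4/3 for 30 lp/mm on FF
print(equivalent_fnumber(11, M43_CROP))  # 5.5, close to the f/5.6 quoted above
```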

All things equal (and they rarely are) a 36mp sensor will deliver sharper images than 24mp, though you'll have to look close. Removing the AA filter also increases sharpness, at the risk of moire. The risk of moire reduces with pixel density.
 

I'd like to thank everyone who's taken the time to contribute to this discussion. I've learned a bit, and more importantly thought a bit more!

Perhaps I could repose the question:

What are the real-world downsides of going for a 36 mpix FF sensor with no AA filter compared to a 24 mpix FF sensor with an anti-aliasing filter (other than storage & processing requirements)?
 

Only upsides, apart from increased risk of moire. But don't expect a massive improvement in image quality; a 50% increase in pixels sounds a lot, but it's not. The difference is slight, needs the best lenses and very careful technique to realise it, and is only visible in very big outputs.
 
Some good info here

Yes - if the info in that article is correct then this part is particularly significant:-

"NOTES ON REAL-WORLD USE IN PHOTOGRAPHY
Even when a camera system is near or just past its diffraction limit, other factors such as focus accuracy, motion blur and imperfect lenses are likely to be more significant. Diffraction therefore limits total sharpness only when using a sturdy tripod, mirror lock-up and a very high quality lens."
 
There is enough stuff on the web about it, to keep anyone reading for a lifetime.

Very high pixel counts, in themselves, can reduce the possibility of moire effects (the theory is boring).

It has been the normal practice to add an anti-aliasing filter, in order to spread light over the four-pixel Bayer pattern, so reducing this possibility.
This was certainly needed at the low pixel densities found on early cameras. It was far from unknown for moire effects to be seen on tiled rooftops and mosaic tiles on modern buildings, or on any other repeating patterns.

However, an anti-aliasing filter is not the only way to avoid these effects. My own little Fuji X20, with a 2/3in sensor of 12mp, has no filter, but relies on a unique Fuji colour array on an otherwise standard Sony sensor. After nearly three years of regular use I have never suffered from moire.
Recently Pentax introduced a switchable solution to the problem, using micro-vibrations of the sensor to spread the light over four pixels.

Many other cameras simply take the small risk and do not include a filter at all. As moire effects are quite rare in any event, and it is easy to remove them during post-processing, the need for such a filter is questionable on professional-level cameras. The gains achieved on more than 99% of shots are more than compensation for the minor trouble of dealing with any that suffer from moire.
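The moire risk discussed above comes from aliasing: any pattern finer than the sensor's Nyquist frequency (half the sampling rate) folds back and is recorded as a false, coarser pattern. A minimal 1-D sampling sketch, with the frequencies chosen purely for illustration:

```python
import numpy as np

# Sampling at rate fs can only represent frequencies up to fs/2 (Nyquist).
# A pattern above that folds back ("aliases") to fs - f, which on a colour
# sensor shows up as moire; an AA filter blurs such detail away first.
fs = 100.0        # samples per unit length
f_signal = 70.0   # pattern frequency above Nyquist (fs/2 = 50)

n = np.arange(200)
samples = np.sin(2 * np.pi * f_signal * n / fs)

# These samples are indistinguishable from a 30-cycle pattern:
alias = np.sin(2 * np.pi * (fs - f_signal) * n / fs)
print(np.allclose(samples, -alias))  # True: identical apart from sign
```

This is also why very high pixel counts reduce the risk: raising the sampling rate pushes the Nyquist limit beyond most real-world detail, so there is little left to fold back.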

The difference between FF cameras with or without an AA filter as you describe, in terms of artefacts, is negligible.
The difference between any 24 and 36mpx sensor, except for massive prints, is almost indistinguishable, as is the difference between a correctly sharpened shot with an AA filter and one with none.
However, in ideal conditions and with perfect technique, that difference is measurable.

It would seem, over recent years, that a growing number of photographers like to have at least the possibility of higher-quality results in their kit bag.

The choice between a low-density pixel camera and a high-density one can often be made for quite different reasons. Probably the main factor is that larger but lower-density pixels are likely to produce greater detail in shadows and a greater dynamic range, especially at high ISO settings. They are often the preferred choice for sports photographers, or those requiring the maximum dynamic range at all settings, like wedding photographers.

On the other hand architectural or landscape photographers usually go for cameras with the maximum pixel counts.
 
I think it's rare a photo is ever ruined by diffraction, other elements are so much easier to mess up ;)

That said, I have seen some very soft landscape shots taken at F22 on a crop body that were noticeably poor.
 
I shoot F16 reasonably often with the D800, F22 occasionally if needed, and regularly print 30x20 inch. I can't remember when I thought diffraction had caused any issues and certainly haven't seen anything like "mush" - big prints aren't supposed to be viewed from 3 inches away anyway. It's a scientific principle that's obviously there, but I wonder whether it's a bit overemphasised in its real-world effects. IMHO, nothing to be concerned about.. lens choice and technique are likely to have far more influence.

Just my experience

Simon
 
For clarity (haha!) diffraction affects the whole of the image, not just around the edges.


Absolutely... however there is a vast difference between small gaps nearing the wavelength of light and large ones of several centimetres, where the proportion of diffracted light is so small as to be unnoticed.
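That scale difference can be put in numbers: the angular half-width of the central diffraction lobe from a gap of width d is roughly lambda/d (a small-angle approximation, so only indicative for very small gaps). A quick sketch assuming 550 nm light:

```python
import math

WAVELENGTH_M = 550e-9  # green light (assumed)

def diffraction_angle_deg(gap_m: float) -> float:
    """Rough angular half-width (lambda / d) of the central lobe, in degrees."""
    return math.degrees(WAVELENGTH_M / gap_m)

print(diffraction_angle_deg(1e-6))   # ~31.5 deg: a 1-micron gap spreads light dramatically
print(diffraction_angle_deg(0.02))   # ~0.0016 deg: a 2 cm gap spreads it imperceptibly
```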
 

I wonder if some confusion is being caused by using the word "edges" in these diffraction discussions in two or maybe even three different senses? Did you mean "edges of the image" here? If so I agree, diffraction affects the entire image uniformly. On the other hand, the reproduction of sharp high-contrast edges in the image is where the effects of diffraction are most visually obvious. And diffraction is caused by the passage of light rays past an edge, such as the edge of the iris in a lens, or, when the iris is wide open, the edges of the lens.
 

Yes. I just thought some people might get misled - easily done.
 
One of the good things about digital is that you can experiment and view the results as soon as you can load them onto your PC. All a person has to do is run off some test shots and if any image degradation caused by smaller apertures is objectionable then it's lesson learned. If the results are acceptable after a little post capture tweaking (maybe some added contrast will help...) then even better :D

Personally I try to keep away from smaller apertures but if I do feel the need I'll use them and indeed I've used f16 with my MFT cameras and the images look ok to me after processing. They may not be exhibition quality but for on screen and small prints they're just fine :D
 
One of the good things about digital is that you can experiment and view the results as soon as you can load them onto your PC. All a person has to do is run off some test shots and if any image degradation caused by smaller apertures is objectionable then it's lesson learned. If the results are acceptable after a little post capture tweaking (maybe some added contrast will help...) then even better :D

Unless you don't actually have the camera yet :)

Personally I try to keep away from smaller apertures but if I do feel the need I'll use them and indeed I've used f16 with my MFT cameras and the images look ok to me after processing. They may not be exhibition quality but for on screen and small prints they're just fine :D

I'm ambitious; I'm after exhibition quality! (Yes, I know, the thing which will make the biggest difference is my own ability, blah, etc, blah)
 
An interesting thing about diffraction is that it is age-related.


Now that I have turned 80, none of my shots display any noticeable diffraction at all.
However, they display little acute sharpness either.
 