No, we haven't just replaced silver halide crystals for pixels...
The reasons are different, but the net result is typically the same.
Yes, the net result is the same because the pixels have little bearing on this issue.
Silver halide crystals are the same size regardless of the size of the negative (for a given film type). With a larger negative you have more of them and they need less enlargement for a given print size, thus "sharper."
You're talking about why larger format film has less grain here... not why it's sharper. Grain size, or pixel size, has nothing to do with why larger film/sensors are better.
Smaller pixels are less efficient at gathering light, start diffracting much earlier, and require more resolving power from the lens... all of which results in lower contrast and thus the same net effect: less "sharpness."
Smaller SENSORS do that, yes... not pixels.
Taken to silly extremes, yes, but an image from a 16MP FX sensor will appear sharper than one from a 16MP APS-C sensor, even though its pixels are larger. It is still the reduced enlargement required that makes larger sensors better.
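To put rough numbers on the enlargement argument (my own illustrative figures, not from the post: a 10-inch-wide print and nominal format widths):

```python
# Linear enlargement = print width / sensor (or negative) width.
# The larger the format, the less the image must be magnified for a given print.

PRINT_WIDTH_MM = 254  # a 10-inch-wide print (assumed example size)

formats = {
    "APS-C (23.5mm)": 23.5,
    "Full frame / FX (36mm)": 36.0,
    "5x4 inch film (127mm)": 127.0,
}

for name, width_mm in formats.items():
    enlargement = PRINT_WIDTH_MM / width_mm
    print(f"{name}: {enlargement:.1f}x enlargement")
```

The same print needs roughly 10.8x magnification from APS-C, 7.1x from full frame, and only 2x from 5x4 film, so any blur in the captured image is magnified correspondingly less on the larger formats.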
Let me explain: I'll keep maths and science to a minimum for the benefit of all readers, but this is how it works.
- All lenses have a limit to the size of object or detail that they can resolve. This is known as the circle of confusion. For most digital SLR lenses, let's assume for the sake of this argument that it's around 0.03mm.
Assuming you use the same lens on an APS-C crop sensor camera like a Nikon D7000 and a full frame camera like the D800, it will still resolve detail down to a minimum of 0.03mm regardless of which camera it is on, as that's a fixed property of the lens.
- However, 0.03mm is smaller in proportion to the area of a full frame sensor than it is to that of a crop sensor.
- 0.03mm is 0.13% of the total image width of an APS-C sensor (23.5mm x 16mm)
- 0.03mm is 0.08% of the total image width of a full frame sensor (36mm x 24mm)
In other words, the blurriness caused by lens defects is about 35% smaller on a full frame camera compared to a crop sensor camera, regardless of its resolution.
Again assuming we could use exactly the same lens on a 5 x 4 inch camera (I know you can't, before anyone points this out), the percentage of image width taken up by the circle of confusion would be around 0.02%.
That is roughly an 81% decrease in circle of confusion size relative to the whole image width compared to an APS-C crop sensor.
Apparent sharpness can therefore be said to be a product of sensor size.
How visible the pixelation (individual pixels) will be in print can therefore be said to be a product of the image resolution.
A combination of large sensor and high resolution is best,
but a 16MP image on a small sensor camera will be visibly less sharp than a 16MP image from a larger sensor camera despite the pixelation being identical.
[Image: identical-sized prints from both files for comparison]
[Image: sharpness at the single-pixel level]
The 16MP D800 images are not taken in DX crop mode, but are resized FX images. All I've done is make the pixels bigger. A 16MP FX sensor would look pretty much the same as the resized D800 image.
The resolution is identical, yet sharpness is greater from the FX image. Fact.
This is why the push for greater and greater resolution from sensors is pointless now. With the D800 we've hit a limit set by the lenses (for 35mm it's actually around 24MP)... not the sensor. If the D4X has greater than 36MP when it arrives (if it arrives), it will be utterly stupid and Nikon just pandering to people like you who feel more pixels mean better images. If you want more sharpness now, you either optimise lens design further (as MFT has done), or move up to medium format digital. There's no more to be done. More pixels have b****r all to do with it.
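As a back-of-envelope check on the "lenses are the limit" claim (my own illustration, not figures from the post: green light and f/8 are assumed, and real limits shift with aperture and lens quality), diffraction alone caps the useful pixel count on a 35mm frame:

```python
# Airy disk diameter (diffraction blur spot) ~ 2.44 * wavelength * f-number.
WAVELENGTH_MM = 0.00055  # ~550nm green light (assumption)
F_NUMBER = 8             # a typical working aperture (assumption)

airy_mm = 2.44 * WAVELENGTH_MM * F_NUMBER  # diffraction-limited spot size

# Pixels much smaller than about half the spot capture no extra detail,
# so take half the Airy disk as a rough minimum useful pixel pitch.
pixel_mm = airy_mm / 2
mp = (36 / pixel_mm) * (24 / pixel_mm) / 1e6  # useful megapixels on 36x24mm

print(f"Airy disk at f/{F_NUMBER}: {airy_mm:.4f}mm -> roughly {mp:.0f}MP useful on 35mm")
```

This crude estimate lands around 30MP at f/8, and drops fast at smaller apertures, which is at least consistent with the idea that 24-36MP sensors are already bumping into what 35mm lenses can deliver.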