Perhaps I'm missing something here, and it's more a question than any disagreement? (not aimed at your post Woodsy, but you do raise the point) ... I can't understand how light is actually a factor at all in the 'real' noise, as opposed to the perceived or 'visible' noise, in any shot...? Noise is an electrical issue with the sensor ..

The photosites have a capacity for electrons, i.e. they count the number of hits, which is simply numbers! Noise is a failure within the CMOS/CCD (or whatever) to accurately record the photon count ... multiply this by 4 and the colours can go seriously wacky! ... I have always believed that the smaller the photosites, the lesser the capacity for accurate image definition .... I agree that better glass can give the illusion of less noise, but that is all it is ... an illusion ...... but at the end of the day the image is what we see, not the numbers
That's the thing, light is not part of it, and hence my whole argument that the lens has nothing to do with the 'visible' or 'real' noise (which I would say are the same thing - after all, we only see it as visible, so that would be real, no?). Yes, you're absolutely right: the sensor is, crudely speaking, just a photon counter, with each potential well having a "capacity" for electron energy states. The low end above what I shall call the 'noise floor', and the high end below highlight clipping, is the dynamic range, in a manner of speaking.

So it's crucial to understand what noise is. It's simply the general term for "signal" that is not part of the original information received. While the sensor is active, it collects any photons whose wavelength will excite energy levels within the wells, including those given off as heat as the sensor warms up. This raises the noise floor at the sensor stage and narrows the gap between the signal and noise levels. "Signal" here ideally means the photons in the visible spectrum that ultimately make up the desired image.
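To put rough numbers on that "noise floor" idea, here's a deliberately simple toy model in Python. The figures are made up purely for illustration (not real sensor data); the point is just that the well counts excitations and cannot tell image photons from heat-generated electrons.

```python
def well_reading(signal_photons, thermal_electrons):
    # The well simply counts excitations; it cannot separate
    # image photons from thermally generated electrons.
    return signal_photons + thermal_electrons

def snr(signal_photons, thermal_electrons):
    # Ratio of wanted signal counts to unwanted thermal counts.
    return signal_photons / thermal_electrons

# A warmer sensor generates more thermal electrons, which
# narrows the gap between signal and noise:
cool_snr = snr(1000, 10)  # 100.0
warm_snr = snr(1000, 50)  # 20.0
```

So the same 1000 "image" photons give a much worse signal-to-noise ratio once the sensor has warmed up, even though nothing about the light has changed.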
So far, this has all been sensor based. We now need to look at the electronics that actually amplify the information, remembering of course that they amplify the signal AND the noise. Here all sorts of noise gets introduced, again in the form of heat, but also things like reflections along the electronic tracks wherever differences in impedance occur.
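That "amplifies the signal AND the noise" point can be sketched in a couple of lines (illustrative numbers again): a gain stage after the sensor scales both quantities identically, so it cannot improve the ratio between them.

```python
def amplify(signal, noise, gain):
    # The amplifier has no way to tell signal from noise;
    # both are multiplied by the same gain.
    return signal * gain, noise * gain

signal, noise = 1000.0, 50.0
amp_signal, amp_noise = amplify(signal, noise, 4.0)

# The signal-to-noise ratio is unchanged by the gain stage:
snr_before = signal / noise          # 20.0
snr_after = amp_signal / amp_noise   # 20.0
```

Whatever ratio of signal to noise left the sensor is the ratio you are stuck with; turning up the gain just makes both bigger.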
Once this has all been amplified up to the desired "exposure", we see the results in the form of the image.
My point in the post above was to illustrate that if the amount of light - or signal - reaching the sensor is the same from both lenses, there will be no difference in the amount of noise. After all, how can there be? If the signal is the same, then the exposure time is the same, which means the sensor is active for the same time, and thus the amount of noise at the sensor stage must be the same. So personally, I see no physical reason why the lens should alter the amount of noise we see in the final image. I quite agree that it will help with image sharpness, and I also agree that smaller sensors, or at least sensors with a higher pixel density, have a lesser ability to capture definition. But that is down to diffraction limitations, and sharper lenses definitely help in that situation.
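The same-light-same-noise argument as a toy calculation (all numbers assumed for the sake of the example): if sensor-stage noise builds up for as long as the sensor is active, then two lenses delivering the same exposure leave identical noise behind.

```python
def thermal_noise(exposure_seconds, electrons_per_second):
    # Assume noise accumulates for as long as the sensor is
    # active, regardless of which lens is in front of it.
    return exposure_seconds * electrons_per_second

# Same signal -> same exposure time -> same accumulated noise:
lens_a_noise = thermal_noise(0.02, 500)  # 10.0 electrons
lens_b_noise = thermal_noise(0.02, 500)  # 10.0 electrons
```

Nothing in that calculation knows or cares which lens was fitted; only the exposure time and the sensor's own behaviour enter into it.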