oldgit
Something I've been mulling over for a while and not been able to find a satisfactory answer to is...
I know that the ISO rating of a digital sensor is its correspondence to the equivalent film speed, and that higher ISO means more noise, much as faster film means more grain.
All sensible straightforward and easy to understand, but...
What is happening inside the camera to adjust the sensor sensitivity?
Clearly it's the same sensor, so there's some sort of jiggery pokery going on. Is it:
- Is it a different pre-charge on the sensor drive lines?
- Less time allowed on the A-D converters?
- Different back bias or bulk node voltage?
.... how does the thing work?
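To make the question concrete, here's a toy model of what I *think* might be happening: a programmable analog gain applied to the pixel voltage ahead of the A-D converter. All the numbers (full-well capacity, read noise, ADC depth) are made up for illustration, so treat this as a sketch of one hypothesis, not how any real sensor works.

```python
import random

FULL_WELL = 40000        # electrons at pixel saturation (assumed value)
READ_NOISE = 3.0         # RMS read noise in electrons (assumed value)
ADC_BITS = 12
ADC_MAX = 2**ADC_BITS - 1

def capture(photons, iso_gain):
    """Simulate one pixel read-out at a given ISO gain."""
    # Photon shot noise: roughly sqrt(N) spread around the mean signal,
    # clipped at the full-well capacity of the photodiode.
    electrons = min(random.gauss(photons, photons**0.5), FULL_WELL)
    # Read noise added at the sense node, *before* any amplification,
    # so higher gain amplifies it along with the signal.
    electrons += random.gauss(0, READ_NOISE)
    # The hypothesis: "ISO" is just this programmable gain stage,
    # applied to the analogue voltage before quantisation.
    voltage = electrons * iso_gain / FULL_WELL   # normalised 0..1
    # Quantise in the A-D converter.
    return max(0, min(ADC_MAX, round(voltage * ADC_MAX)))
```

If that model is right, doubling the gain (one stop of ISO) halves the light needed to hit a given output level, but the read noise gets doubled too, which would explain why high-ISO shots look noisier from the very same sensor.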
I'm also making the assumption that the sensor is "equivalent" to a RAM without the lid on, and that there are therefore row and column drivers/decoders.
I've tried googling for "CMOS image sensors", "FET image sensors", etc., but haven't found a decent reference.
Can anyone help?
