XEyedBear (Tony)
I'm currently reading 'Camera RAW for Digital Photographers Only' by Rob Sheppard - a book which I can with all sincerity thoroughly recommend that you ignore (but that's another forum, I guess). He talks (God, does he talk...) about histograms. If you have a copy of this book, look at page 26. Three histograms are shown, of which the top two have intrigued me.
These two histograms show the data from two reasonably well-exposed images. There is no clipping and there are no extreme or dominating peaks. Yet the area 'under the curve' in one histogram occupies about 70% of the available graph area; in the other it occupies only 30%.
Assuming these are 2 images from the same sensor, how can this be?
If I understand them correctly, in a histogram each point on the x-axis represents one of the 'quantisation' levels that each sensor (or its group of 4 sensors: RGGB?) is capable of measuring. I guess there are 2^8 quantisation levels for each sensor, for a total of 24 bits of colour tone (assuming only 1 of the 2 'green' sensors in each pixel is used for this purpose).
The y-axis is a count of the number of sensors that have measured each of these 2^8 quantisation levels. So the total of all the y-axis readings should equal the total number of sensor points on the light-measuring device.
And therefore, for two images from the same sensor, the sum (or integral) of the y-axis readings should be constant. But for the two figures in this book they are demonstrably different.
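To check my own reasoning I put together a minimal sketch (assuming NumPy, and two made-up 8-bit greyscale 'images' standing in for raw sensor data): however the tones are spread, the histogram counts should always sum to the same pixel total.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two hypothetical 8-bit images from the same 100x100 "sensor",
# one with a narrow tonal range, one with a wide tonal range.
img_a = rng.integers(100, 156, size=(100, 100))  # narrow spread
img_b = rng.integers(0, 256, size=(100, 100))    # wide spread

# One bin per quantisation level: 2^8 = 256 bins.
hist_a, _ = np.histogram(img_a, bins=256, range=(0, 256))
hist_b, _ = np.histogram(img_b, bins=256, range=(0, 256))

# Both integrals are identical: the total pixel count.
print(hist_a.sum(), hist_b.sum())  # both 10000
```

So the arithmetic does come out constant, which only sharpens the puzzle about the book's figures; whether the book's plots scale their y-axes differently is not something it says.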
What have I misunderstood?
