Are the Days of the Professional Photographer Numbered?

The idea of huge MP counts (or even GP - gigapixels) is self-defeating, because you would need computing systems several orders of magnitude faster than today's just to process the images.
Why won't we have it? There was a comment on The Infinite Monkey Cage the other day that there is more computing power in an electronic greeting-card tune chip than there was in every computer on the planet 50 years ago!
Can you imagine trying to process RAW files of 100 MB or so?
Can you imagine trying to process a 30 GB file on a 486 15 years ago?
I hope that any real innovation will come in the form of sensors with far less noise and higher sensitivities than we have today.

THEN you could have real HDR in camera, using the increased pixel count not to produce super-large images but to produce over-exposed, normal, and under-exposed pixels, which could then be combined in camera to produce HDR with a vastly increased dynamic range, all in a single shot.
This has lots of possibilities, doesn't it?
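The idea above can be sketched as a simple exposure-fusion step. This is only an illustration, assuming a hypothetical sensor that reads neighbouring pixels at three different exposures; the mid-grey weighting used here is one common fusion heuristic, not any specific camera's algorithm:

```python
import numpy as np

def fuse_exposures(under, normal, over):
    """Combine three exposures of the same scene into one HDR-like image.

    Each input is a float array scaled to [0, 1]. Pixels are weighted by how
    close they sit to mid-grey, so blown-out or crushed values contribute less.
    """
    stack = np.stack([under, normal, over])          # shape: (3, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)   # favour well-exposed pixels
    weights /= weights.sum(axis=0)                   # normalise per pixel
    return (weights * stack).sum(axis=0)

# Toy 2x2 scene captured at three exposures
under  = np.array([[0.05, 0.10], [0.20, 0.30]])
normal = np.array([[0.20, 0.40], [0.60, 0.95]])
over   = np.array([[0.50, 0.80], [0.98, 1.00]])

hdr = fuse_exposures(under, normal, over)
print(hdr)  # every pixel lands between its darkest and brightest capture
```

Because each output pixel is a weighted average of the three captures, highlight detail from the under-exposed read and shadow detail from the over-exposed read both survive in one shot.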
 
Why won't we have it?

Because computer speed increases seem to have really slowed down.

Years ago we would see a doubling of speed almost every 12-18 months.

Now the best that seems to be happening is slowly upgrading the different devices which make up the computer.

So we now have 2TB hard drives and solid state drives, faster USB connections etc, but the speed of processors now seems to be fairly stagnant.

Intel's latest processor, the i7, has a clock speed of 3.06 GHz, not really much of an increase over processors of, say, 5 years ago.

One of my computers has a clock speed of 2.8 GHz, and that's approximately 10 years old.

And of course I realise that the performance of the i7 is vastly superior to my computer's, but computer performance now comes through incremental improvements to the core processor and ancillaries like solid-state drives, not through raw processor speed.

And such improvements will slow down steadily.

The next big breakthrough is likely to be parallel computing, and quad-core is really only the precursor to that.

But a parallel processing computer is likely to be very expensive, even when it does arrive.
 
THEN you could have real HDR in camera, using the increased pixel count not to produce super-large images but to produce over-exposed, normal, and under-exposed pixels, which could then be combined in camera to produce HDR with a vastly increased dynamic range, all in a single shot.

I already do :D
 
Because computer speed increases seem to have really slowed down.

Years ago we would see a doubling of speed almost every 12-18 months.

Now the best that seems to be happening is slowly upgrading the different devices which make up the computer.

And such improvements will slow down steadily.
I think you are mistaking what you can buy (or have bought) for what is being done.

I've not seen any evidence that Moore's Law - the number of transistors on an IC doubling every 18-24 months - is 'slowing'. Yet. Current technology will run into a physical limitation around 2020, when electron leakage between components only around 10 atoms apart will lead to power loss and data corruption, but I expect imprint lithography, silicon nanowires, phase-change memory, spintronics, optoelectronics, or something else will have found a way to push development onwards by then.
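The doubling claim is easy to sanity-check with arithmetic. Starting from the Intel 4004's roughly 2,300 transistors in 1971 (the usual baseline for this calculation) and assuming a doubling every 2 years:

```python
# Transistor count if Moore's Law (doubling every ~2 years) holds, starting
# from the Intel 4004's roughly 2,300 transistors in 1971.
start_year, start_count = 1971, 2_300

def projected_count(year, doubling_years=2):
    return start_count * 2 ** ((year - start_year) / doubling_years)

for year in (1971, 1991, 2011, 2020):
    print(year, f"{projected_count(year):,.0f}")
```

Forty years of doubling (20 doublings) projects roughly 2.4 billion transistors by 2011, which is in line with the billion-transistor chips that actually shipped around then.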
 