The Playstation example is a bit of a clupea rubra, because the manufacturers take a loss on every Playstation or Xbox sold. They then make that up by charging the games publishers a fee for pressing the games. The games have a roughly 2-year development cycle, and the publishers want a large installed userbase of gamers on a given console to ensure they sell enough copies to make it worthwhile. So consoles only really take off - getting the best games (because software teams have come to understand the platform) and earning revenues - once they're 2-4 years into their production run. And the manufacturers do indeed want to stretch the consoles' lives as long as possible, so that they sell more games before the loss-making release of the next generation. The only thing that spurs them to release a new console is that if they wait too long their competitors will yell "we have teh new über thang", they'll look uncool and irrelevant (which matters in that market), and they'll get left behind in the number of devotees to their platform in the next generation...
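For anyone who wants the arithmetic behind that, here's a rough sketch of the loss-leader model. Every figure in it is a made-up placeholder (I have no idea what the real per-unit loss or licence fee is); it just shows how the per-console loss gets clawed back through per-game licensing:

```python
# Rough sketch of the console loss-leader model.
# Every number here is a made-up placeholder, not a real Sony or Microsoft figure.

loss_per_console = 100.0      # hypothetical loss taken on each unit sold
licence_fee_per_game = 8.0    # hypothetical fee charged to publishers per copy pressed

# How many games must each owner buy before the console pays for itself?
break_even = loss_per_console / licence_fee_per_game
print(f"Break-even at ~{break_even:.1f} games per console")

# The longer a console stays on sale, the more games each owner accumulates,
# which is why stretching a generation's life pays off.
for attach_rate in (2, 5, 10, 15):
    net = attach_rate * licence_fee_per_game - loss_per_console
    print(f"{attach_rate:>2} games per console -> net {net:+.0f} per unit")
```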
Of course they squeeze every last penny out of 'current' technology before releasing a 'new' one, it's how they repeat the cycle.
Consider the speed that technology progresses, then consider the life span of a device or product line. Which one evolves faster?
Consider Playstations for just a minute - logically now, and don't rush.
Do Sony release a new console every few months? Technology that could outdate the current line comes along roughly that often, but would Sony drop their current line just because they discovered a better, more efficient or improved design?
Can you imagine the losses if they prematurely released new lines while the current lines are still generating profit?
I can kinda see where you're coming from - "current technologies work OK, so why mess around with a disruptive technology?" - but the loss-per-sale model makes the games console industry fundamentally different. Camera manufacturers make a profit on every lens or camera sold [1], so the incentive for them to make a "generational leap" is greater, assuming they see a way to make a huge image quality improvement (or cost saving).
I have to say I'm in two minds about this. When I first read Hoppy's original post, I did think he was blowing it out of all proportion, and I wondered if the thread would become as polarised as this. But then again, every piece of glass that a manufacturer puts in a lens degrades the image a little - they're enlarging the projected image with one element, sharpening it with another, trying to correct aberrations with another, and so on. Modern lens elements may be very good indeed, but it's a fundamental law of photography that lenses are lossy.
When I read Hoppy state that "all other DSLRs are just film cameras with film-based lenses", I kinda wanted to characterise that as "all other DSLRs are trying to optimally focus an image on a light-sensitive surface, lol!" But you can see that if a manufacturer were able to consistently reduce the number of elements in their lenses and do the same job in software, they'd be crazy not to - each lens element that rolls off the production line costs money to manufacture, whereas software only costs you once to write.
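Just to illustrate what "do the same job in software" could mean at its very simplest, here's a toy sketch of undoing barrel distortion with a one-coefficient radial model. This is purely my own illustration - the coefficient and the grid image are made up, and real in-camera correction profiles are far more sophisticated than this:

```python
# Minimal sketch of "fix it in software": undoing barrel distortion with a
# simple radial polynomial model. k1 is a made-up example value; real cameras
# ship per-lens correction profiles worked out by the manufacturer.
import numpy as np

def undistort(img: np.ndarray, k1: float = -0.15) -> np.ndarray:
    h, w = img.shape[:2]
    # Normalised pixel coordinates, centred on the middle of the frame
    y, x = np.indices((h, w), dtype=np.float32)
    cx, cy = (w - 1) / 2, (h - 1) / 2
    xn, yn = (x - cx) / cx, (y - cy) / cy
    r2 = xn ** 2 + yn ** 2
    # Radial model: sample the distorted image at r * (1 + k1 * r^2)
    scale = 1 + k1 * r2
    src_x = np.clip(xn * scale * cx + cx, 0, w - 1).astype(int)
    src_y = np.clip(yn * scale * cy + cy, 0, h - 1).astype(int)
    return img[src_y, src_x]

# Toy usage: a grid pattern stands in for a real photo
grid = np.zeros((200, 200), dtype=np.uint8)
grid[::20, :] = 255
grid[:, ::20] = 255
corrected = undistort(grid)
```

Point being, once that kind of correction lives in firmware you can ship a lens with fewer, simpler elements and let the maths mop up the distortion.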
I'm a full-frame fancier, and I like the whole looking-in-a-viewfinder, seeing-what-the-camera-sees thing. I haven't used liveview, and SLR just makes sense to me. I don't want to buy into Hoppy's argument, but he does make some good points along these lines, such as his talk of getting rid of the mirror.
I guess a big part of the question here is whether sensors & software will become good enough. I am also pretty flummoxed as to how Canon would migrate from their current huge range of EF lenses into some radical new system.
Stroller.
[1] I would imagine the majority of buyers of the Canon 1000D only ever use the kit lens, or maybe one other, so I can't see cameras being sold as loss-leaders to get people to buy lenses.
How do you know that we're not? I could be on a shoot right now