I don't keep ignoring it. Most people don't view your images as prints, nor will they view them on a calibrated monitor. If they look at them on a TV panel, those are mostly around 300-400 cd/m2 out of the box; gaming monitors are roughly the same. At least two manufacturers have panels on the market today at around 1000 cd/m2, and we're told to expect 2000 within 18 months.
So you're suggesting that calibration is not necessary?
Yeah... many people will view your work on a variety of crap. However, making sure it is created on a correctly calibrated screen means that, should it be printed, it is likely to be printed more accurately. It also means that other industry professionals, who are likely to have a calibrated workflow, will see it as correct too. LOADS of people print. Maybe not amateurs, but you talk as if printing is some long-lost, forgotten, archaic process.
They can carry on making panels that can sear your retinas as much as they like, but a photographer will not be calibrating them to such a ludicrous brightness level, as it simply wouldn't be accurate.
120 cd/m2 is great if your end viewers are looking at screens of up to 400 or so, but once there's a large user base (OTT video providers like Netflix and Amazon are rolling it out this year, and DVB are standardising broadcast delivery at the moment), photographers will need to look at how images should be displayed on these media.
There is simply no need for a screen of such brilliance when editing still imagery. The dynamic range of the camera gear is nowhere near sufficient to make use of it, and calibrating so you have a maximum brightness of 1000 cd/m2 or greater would mean it is simply not accurate. You are making the mistake of thinking that because the screen has a high dynamic range, it needs to be calibrated differently for still photography. It doesn't.

Plus... can you imagine working on an image where the highlights are 2000 cd/m2 for long periods at a desktop monitor? Sorry, that would be a health hazard. It would be like staring into a bright sky for hours on end. I'm sure such screens are great in home cinema applications, but seriously, if you had a high-key image and zoomed in on the highlights on a 30" screen at a luminance like that, it would be impossible and unhealthy to work on. An average halogen car headlight on full beam measures between 3000 and 6000 cd/m2; modern HID headlamps are around 8000 to 10,000 (which is why European law insists on tighter beam patterning and automatic self-levelling).
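Just to put those figures in context, here's a rough back-of-the-envelope sketch in Python (my own illustration, using the luminance values quoted above; the 120 cd/m2 white is the usual stills-editing calibration target) converting them into photographic stops above a calibrated editing white:

    import math

    EDIT_WHITE = 120.0  # typical calibrated white point for stills editing, cd/m2

    # Luminance figures quoted above, in cd/m2
    levels = {
        "typical TV / gaming panel": 400.0,
        "current HDR panel": 1000.0,
        "promised panel": 2000.0,
        "halogen headlight, low end": 3000.0,
        "HID headlamp, high end": 10000.0,
    }

    for name, luminance in levels.items():
        stops = math.log2(luminance / EDIT_WHITE)
        print(f"{name}: {luminance:>7,.0f} cd/m2 = +{stops:.1f} stops over a 120 cd/m2 white")

On those numbers a 2000 cd/m2 highlight is over four stops brighter than the white you'd edit against today, and only about half a stop below the bottom end of that halogen headlamp range.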
Do me a favour... go switch your headlights on later when it's dark.. put them on full beam, and sit 1 metre away. Stare straight into the beam. Let me know how long you can stand it for.. OK?
You keep posting specs for these screens, but I'm almost certain that their only real application will be in home entertainment, not in serious digital imaging, publishing and printing houses.
There's just absolutely no need. It would be a workplace health hazard, and it would be impossible to proof any work accurately.
Also, these screens are designed for a completely different market, and so far they wouldn't come remotely close to being used to their full potential by footage created in any camera currently available. I would also be sceptical of the claimed brightness levels. Be aware that many screen manufacturers make equally outlandish claims about contrast ratios that are never actually achieved in reality. You see £200 monitors claiming 10,000:1 contrast ratios, yet something top-end like an Eizo CG series or a NEC Reference monitor will produce a contrast ratio of around 650:1.
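For what it's worth, the contrast-ratio arithmetic is simple enough to sanity-check yourself. A quick sketch (my figures are assumptions for illustration, not measurements):

    # Contrast ratio is just white luminance divided by black luminance (cd/m2).
    def contrast_ratio(white: float, black: float) -> float:
        return white / black

    # A print-oriented calibration at 120 cd/m2 white with a realistic measured
    # black of around 0.18 cd/m2 (assumed figure for a decent IPS panel):
    print(f"{contrast_ratio(120, 0.18):.0f}:1")   # ~667:1, in line with the ~650:1 above

    # For a budget monitor's claimed 10,000:1 at, say, 300 cd/m2 white (assumed),
    # the black level would have to measure an implausibly deep 0.03 cd/m2:
    print(f"{300 / 10_000} cd/m2")                # 0.03 cd/m2

Which is exactly why those headline figures deserve a pinch of salt.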
BTW.. apart from the industry white paper you linked to, is there anything else regarding these screens? Do any actually exist yet?