Calibrated Monitors, editing for display?

Thread starter: Deleted sulking member 63079 (Guest)
So, I've been making do with my HP Envy Dv7 laptop for all my editing, but, I've just added a Dell Ultrasharp U2415h to my setup.

What a difference! But also, now a dilemma. Most people who look at websites etc, aren't going to be doing so on a calibrated monitor.

My laptop (and my wife's Mac) show the whites much brighter than the Ultrasharp, with a lot less detail. So if I edit it "correctly" on the Ultrasharp monitor, it looks overexposed on the laptops.

Do I just not worry about it and stick with the calibrated display? Or edit two ways: once for electronic consumption using the laptop screen, which will most likely match how clients view it, and again on the calibrated display for printing?

Or just once on the calibrated screen and not worry about it?
 
Just once for the calibrated screen.

If you tilt a laptop screen it changes the apparent exposure/brightness; you're not going to do an edit for each degree of tilt of the laptop screen.

The only time I do a different edit is for Facebook posts and there are guides online to get the best out of Facebook compression.
 
Edit on the correct screen.

BTW.. is your U2415 calibrated? Did you calibrate it?
 
I haven't, will be doing so this weekend. It came with a calibration report from their own calibration though, and most reviews suggest it's pretty close straight out the box.
 
When editing video or stills you edit on a calibrated reference monitor in reference viewing conditions.

As long as it looks good on the reference monitor, it's good. If the end viewer ruins it due to their hardware or settings, that's their problem.
 
I can assure you it is not. Not sure what reviews you've been reading, but the trusted ones (TFT Central and Prad) suggest it's awfully calibrated.

A max Delta E of over 5 is regarded as poor, and a luminance figure of 218cd/m2 is STUPIDLY high. That black luminance level is decidedly dodgy too.

This matches my own findings with pre-calibrated Dell screens. Not worth the paper it is written on.
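For anyone wondering what the Delta E figure in those reviews actually measures: it's the distance between the colour the screen was asked to show and the colour it actually produced, computed in CIELAB space. A minimal sketch of the original CIE76 formula (reviewers often use the newer CIEDE2000 variant, which weights the terms differently, but the idea is the same):

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB colours (CIE76 Delta E).

    Rough rule of thumb: dE under 1 is imperceptible, under 3 is fine
    for general work, and over 5 (the review figure above) is poor.
    """
    dL = lab1[0] - lab2[0]
    da = lab1[1] - lab2[1]
    db = lab1[2] - lab2[2]
    return math.sqrt(dL * dL + da * da + db * db)

# A patch measured at L*=50, a*=3, b*=4 against a target of L*=50, a*=0, b*=0:
print(delta_e_cie76((50, 3, 4), (50, 0, 0)))  # → 5.0
```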

The de facto standard is to calibrate to 6500K, 120cd/m2 luminance and gamma 2.2... and THEN manage your room lighting to match. A luminance value of over 200cd/m2 is really, really high. You'll almost certainly be producing dark images as a result of that.

The minute you manually start adjusting, then your calibration (using the term loosely) will be gone anyway.

Invest in a means of calibration if you are even remotely concerned about producing work that is accurate.
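To make that 120cd/m2 / gamma 2.2 target concrete: after calibration, an 8-bit pixel value maps to absolute luminance roughly like this (a simplified sketch that assumes a pure power-law gamma and ignores the black point):

```python
def pixel_to_luminance(value, peak_white=120.0, gamma=2.2):
    """Map an 8-bit pixel value to screen luminance in cd/m2,
    assuming a pure gamma-2.2 curve and a 120cd/m2 calibrated peak."""
    return peak_white * (value / 255.0) ** gamma

print(pixel_to_luminance(255))  # full white hits the 120cd/m2 peak
print(pixel_to_luminance(128))  # mid grey lands around 26cd/m2
```

Calibrating to 218cd/m2 instead just scales that whole curve up, which is why you end up compensating by editing your images darker.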
 
Picked up the X-Rite i1 pro at the weekend; think I've gone through the calibration properly.

Manually set brightness to 120 and set the RGB channels manually to begin with, changed the number of colour patches to large and let it do its thing. It did make a noticeable difference. Then opened colour management, ticked the box to pick my own settings, added the new profile, set it as default, then deleted the other one.
 
That should do it... although the new profile would have been set to default anyway.
 
120 is fine for prints, web viewing, video editing etc.

Only time you really need to change is if you have a High Dynamic Range screen.
 
High dynamic range? Such as? The contrast ratio of a screen doesn't determine its luminance value; that's determined primarily by the level of ambient lighting. The contrast ratio of the screen has no bearing on it.
 
SIM2, Pulsar .... There are a few around and it's the hot topic in video.

The ones I've played with do 0.01 to 4000 cd/m^2 with a high enough bit depth and transfer function to have no banding.
 
@st599

Correct is correct, and no matter the contrast ratio, the luminance for graphics and imaging is determined by a median value, assuming a normal-contrast scene. The fact that a screen CAN go that bright is irrelevant, because if you calibrate it to 120cd/m2, that's all it will go to, because that's all it will be allowed to go to.

Why would anyone want a desktop display that is capable of 4000cd/m2? It would actually blind you. That's roughly the brightness of a car headlight on full beam from one metre away.. LOL
 
Dolby say you need 10,000cd/m2 in a TV screen. There's already a 1,200cd/m2 set on the consumer market.

Many in the video industry believe it's the next big thing.

The idea is to keep averages near what they are now, but allow detail in the highlights and true speculars. The images are far more lifelike than current Rec.709. (You do need a decent 14+ stop camera though.)
 

10,000cd/m2? LOL... don't be ridiculous. That would actually cause severe and permanent eye damage if you stared at that.

Can you please paste up a link to your source?

You sure that's not MICROcandelas?... not Candelas?

Anyway... if you calibrate a screen to 120cd/m2, then that is the maximum it will go to. Whatever the screen's maximum brightness capability is does not matter in the slightest. The profile will prevent it going any brighter. That's the POINT of calibration.
 
I definitely don't mean micro candelas. Dolby Vision is designed to reach 10000 CD/m^2.

(See http://www.dolby.com/us/en/technologies/dolby-vision/dolby-vision-white-paper.pdf ) Other HDR systems are under discussion.

Nor will it make you go blind, it's easily in the eye's adaptation range, but it may cause you to flinch.

You calibrate your screen, but sRGB isn't an absolute system: 255 is peak white, irrespective of what that is in absolute terms. So as screens become brighter, calibrations will shift. 120cd/m2 was chosen as that's the CRT limit.
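That's the key difference with the system described in the Dolby white paper: PQ (SMPTE ST 2084), the transfer function behind Dolby Vision, maps code values to absolute luminance rather than to a fraction of whatever the screen's peak happens to be. A sketch of the EOTF using the constants from the spec:

```python
def pq_eotf(n):
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal n in [0, 1] ->
    absolute luminance in cd/m2, peaking at 10,000."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))  # top code value → 10000 cd/m2
print(pq_eotf(0.0))  # → 0.0
```

Note how a mid code value of 0.5 lands around 92cd/m2: most of the signal range is spent on the everyday luminance band, with the top of the range reserved for highlights and speculars, which is exactly the "keep averages near what they are now" idea.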
 
Alright mate, I've got a Dell monitor that supports hardware calibration, but all I do is let the X-Rite do its own thing. How do I go about setting the brightness and RGB manually to the required levels?
 
If you change the mode on the monitor to Custom Colour, it then gives you the RGB channels separately. At the first stage of calibration, it asked me to tick what I had manual control of, so I said brightness and RGB. Then after setting the brightness to 120, it asked me to move the RGB sliders. I think mine needed 97 97 99 to hit the target white balance.
 
The colourspace has nothing to do with brightness. As you said, 255 is maximum, but it's an arbitrary figure. 255 red is more red in Pro Photo than it is in sRGB for example.

For photography, having a screen so bright it actually hurts (which the luminance levels you're suggesting genuinely would) is useless. A) Cameras can't actually capture such a stupid dynamic range, and B) even if they could, you can't print it.

So far as calibrating for photography goes, 120cd/m2 is a sensible choice for a moderately lit room, and if you have calibrated for that luminance level, then that will be the maximum the screen will produce.. that's the point of calibration. You keep ignoring that very important point though.
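The "255 red is more red in ProPhoto than in sRGB" point can be seen directly from the published primaries. A sketch using the red-primary XYZ values from the standard RGB-to-XYZ matrices for each space (the chromaticity x of pure red comes out around 0.64 for sRGB and about 0.73 for ProPhoto, i.e. the ProPhoto red primary sits much further out towards the spectral locus):

```python
def chromaticity_x(X, Y, Z):
    """CIE xy chromaticity x-coordinate of a colour given its XYZ."""
    return X / (X + Y + Z)

# Red-primary XYZ values (first column of the standard RGB->XYZ matrices):
srgb_red = (0.4124, 0.2126, 0.0193)      # sRGB (D65 white point)
prophoto_red = (0.7977, 0.2880, 0.0000)  # ProPhoto/ROMM (D50 white point)

print(chromaticity_x(*srgb_red))      # ≈ 0.64
print(chromaticity_x(*prophoto_red))  # ≈ 0.735
```

Same 255 code value, very different colour, which is why the colourspace and the luminance calibration are separate questions.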


Alright mate, I got a dell monitor that supports the hardware calibration. But all I do is let the xrite does its own thing. How do I go about set the brightness and RGB manually to the required levels?

Are you using the Dell software? You know if you use the X-Rite software it will not be hardware calibrated.
 
I don't keep ignoring it. Most people don't view your images as prints, nor will they view them on a calibrated monitor. If they look on a TV panel, they're mostly around 300-400cd/m2 out of the box, gaming monitors are roughly the same, at least two manufacturers have panels on the market today at around 1,000cd/m2, and we're told to expect 2,000cd/m2 within 18 months.

120 is great if your end viewers are looking at screens of up to 400cd/m2 or so, but once there's a large user base (and OTT video providers like Netflix and Amazon are rolling out this year, and DVB are standardising broadcast delivery at the moment), photographers will need to look at how images should be displayed on these media.
 
Is the 10,000 not the contrast ratio, rather than the brightness?
 
So you're suggesting that calibration is not necessary?

Yeah.. many people will view your work on a variety of crap, however, ensuring that it is indeed created on a correctly calibrated screen means that should it be printed, it is likely to be printed more accurately. It also ensures that other industry professionals, who are likely to have a calibrated workflow, will view it as correct too. LOADS of people print. Maybe not amateurs, but you talk as if printing is some long lost, forgotten and archaic process :)

They can carry on making panels that can sear your retinas as much as they like, but a photographer will not be calibrating them to such a ludicrous brightness level, as it simply wouldn't be accurate.



There is simply no need for a screen of such brilliance when editing still imagery. The dynamic range of the camera gear is nowhere near sufficient to make use of it, and calibrating so you have a maximum brightness of 1,000cd/m2 or greater would mean it is simply not accurate. You are making the mistake of thinking that because the screen has a high dynamic range it needs to be calibrated for still photography differently. It doesn't.

Plus.... can you imagine working on an image where the highlights are 2,000cd/m2 for long periods at a desktop monitor? Sorry.. that would be a health hazard. It would be like staring into a bright sky for hours on end. I'm sure such screens are great in home cinema applications, but seriously, if you had a high-key image and zoomed in on the highlights on a 30" screen with a luminance value like that, it would be impossible and unhealthy to work on it. An average halogen car headlight on full beam measures between 3,000 and 6,000cd/m2, and modern HID headlamps around 8,000 to 10,000 (which is why European law insists on tighter beam patterning and auto self-levelling).

Do me a favour... go switch your headlights on later when it's dark.. put them on full beam, and sit 1 metre away. Stare straight into the beam. Let me know how long you can stand it for.. OK?

You keep posting specs for these screens, but I'm almost certain that their only real application will be in home entertainment, and not in serious digital imaging, publishing and printing houses.

There's just absolutely no need. It would be a work health hazard, and it would be impossible to accurately proof any work.

Also, these screens are designed for a completely different market and, so far, would never even be remotely used to their full potential by footage created in any camera currently available. I would also be sceptical of the claimed brightness levels. Be aware that many screen manufacturers make equally outlandish claims about contrast ratios that in reality are never actually achieved. You see £200 monitors claiming 10,000:1 contrast ratios, yet something top end like an Eizo CG series or a NEC Reference monitor will produce a contrast ratio of around 650:1.
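On the contrast ratio point: the figure is just measured white luminance divided by measured black luminance, which is why the marketing numbers are so easy to inflate (quote a "dynamic" figure with the backlight modulating, or a black level the panel never actually holds). A trivial sketch; the black-level figures here are illustrative assumptions, not measurements:

```python
def contrast_ratio(white_cd_m2, black_cd_m2):
    """Static contrast ratio: peak white luminance over black luminance."""
    return white_cd_m2 / black_cd_m2

# A calibrated IPS panel: 120cd/m2 white, an assumed ~0.18cd/m2 black:
print(round(contrast_ratio(120, 0.18)))   # ≈ 667:1, the ballpark quoted above
# The same white paired with an implausible 0.012cd/m2 black claim:
print(round(contrast_ratio(120, 0.012)))  # → 10000:1, the marketing number
```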

BTW.. apart from the industry white paper you linked to, is there anything else regarding these screens? Do any actually exist yet?
 
Dave, I take it using the Dell software I don't need to fiddle with the brightness and RGB?
 
It shouldn't be necessary, no, but you can if you want to. Just make sure however you do it, that you are in custom colour mode.
 
Hardware cal, put the monitor into colour space mode under CAL 1 and 2. I suppose if I do custom colour I won't be able to hardware cal?
 
My bad... you're right. It does indeed need to be in CAL, or you'll be calibrating the GPU LUT.
 
@ricky1980 Oooh... one more thing.. Struggling to remember, as I use an Eizo screen 99% of the time, but check whether the calibrator needs to be plugged into the monitor's USB port rather than the computer.
 
The monitor's USB cable needs to be plugged into the PC, but the calibrator can be plugged into either the monitor's or the PC's USB hubs.
 
4,000cd/m2 screens definitely exist; I've seen one and seen the independent measurements. I've also chatted to the colourist who's used one for a test sequence.

http://www.mediaandbroadcast.bt.com/wp-content/uploads/D2936-UHDTV-final.pdf is a good starter.


You're still missing the point by a country mile. While you may want a screen that can provide searing brightness for transient things in a movie (the flash of a bomb, gunfire, or a panning sequence including a star in a sci-fi film), you do NOT want this on a desktop display for editing still images, where such brightness is constant!! If I need to explain why, then there is no hope for you.

Calibration of screens designed for movie, home cinema, or as I suspect for a lot of these screens, outdoor use, is a totally different thing. TVs are calibrated to completely different gamma and luminance levels. There's no parallel.

No one would want to stare at a screen of more than 1,000cd/m2 with an image with a lot of highlights for long.. nobody. You'd just get a MASSIVE headache at best.
 
Now I'm wondering if I've missed something on mine as I couldn't see a CAL 1 or CAL 2 mode on the monitor controls?

Will have another play tonight
 
What version of Dell monitor have you got? Not all of them have the CAL 1 and 2 options, only the ones that support hardware calibration.
 
U2415h, so that's probably it :)

The report from the X-Rite after calibration was pretty spot on, so that's fine, just wanted to make sure I wasn't missing something!
 