The great RawPeg Debate!

I have to add my twopence-worth as a locally renowned legend in my own bathtub :)

I often make galleries for clients, which involves resizing and saving as JPEG from an original JPEG that was itself saved from CS5's raw converter - so two JPEG saves. I can see a difference once the JPEG hits the web, but not with the two files side by side in CS5, so I think more is lost visually when things are published after being saved, and the huge differences in how different browsers and systems render JPEGs are far more of an issue than how often the file was saved.

Is the file being compressed by the gallery software?

I just tried it with two of my own images - one opened, duplicated and resaved 5 times, and the same one opened and saved only once - I can't see any difference at all on my monitor, even at pixel-peeping level.
 
Just another perspective here which I haven't seen mentioned (recently)-

Back in the film days each film had different characteristics. You chose the film for how it portrayed your subject. RAW files give you the option to choose which film you used after the fact. JPEGs are the result of your camera deciding which film to use.

Aren't they?
 
Just another perspective here which I haven't seen mentioned (recently)-

Back in the film days each film had different characteristics. You chose the film for how it portrayed your subject. RAW files give you the option to choose which film you used after the fact. JPEGs are the result of your camera deciding which film to use.

Aren't they?

Not really. The camera will output a JPEG with whatever processing parameters have been pre-set: Canon's Picture Styles - Portrait, Landscape etc - plus a load more that can be downloaded from the Canon website, or your own choice of home-grown parameters. Fujis have JPEG presets based on their film emulsions, like Velvia, Provia etc.
 
Same with Nikon, and LR will also let you apply the same settings in post... to either RAW or JPEG



And a little-known, or little-used, feature of Photoshop is the LUT (look-up table) function.

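For anyone curious what a look-up table actually does under the hood, here's a minimal Python sketch (NumPy assumed; the S-curve itself is made up for illustration) - every input code value simply indexes into a table of replacement values:

```python
import numpy as np

# Build a hypothetical S-curve LUT for 8-bit data using a smoothstep shape.
x = np.linspace(0.0, 1.0, 256)
lut = np.round((x * x * (3.0 - 2.0 * x)) * 255).astype(np.uint8)

# Applying the LUT is just a per-pixel table lookup (NumPy fancy indexing).
image = np.array([[0, 64, 128, 192, 255]], dtype=np.uint8)
graded = lut[image]

print(graded)  # endpoints stay put, mid-tones get the S-curve contrast
```

Photoshop's 3D LUTs work on colour triplets rather than single channels, but the principle - precomputed table in, remapped values out - is the same.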



The biggest deficiency with JPEG, as far as I'm concerned, is not the compression but the fact that it's 8-bit. Making strong adjustments to a bitmapped 8-bit file can do serious damage to fine tonal gradations that 12- or 14-bit RAWs, or exported 16-bit TIFF files, simply avoid. Editing in 8-bit sucks.
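To put a rough number on that, here's a toy Python sketch (NumPy assumed; the figures are illustrative, not from any real camera). The same eighth of the tonal scale holds only 32 codes in 8-bit but 8192 in 16-bit, so stretching it hard leaves big gaps in one and not the other:

```python
import numpy as np

def stretch(values, lo, hi, out_max):
    """Linearly map [lo, hi] onto [0, out_max], clipping outside."""
    scaled = (values.astype(np.float64) - lo) / (hi - lo) * out_max
    return np.clip(np.round(scaled), 0, out_max)

# A shadow region spanning 1/8 of the tonal scale, at both bit depths.
levels_8 = np.arange(0, 32)       # 32 distinct codes on the 0..255 scale
levels_16 = np.arange(0, 8192)    # same range holds 8192 codes at 16-bit

out_8 = stretch(levels_8, 0, 31, 255)
out_16 = stretch(levels_16, 0, 8191, 65535)

print("distinct 8-bit output levels: ", len(np.unique(out_8)))    # 32 - banding
print("distinct 16-bit output levels:", len(np.unique(out_16)))   # 8192 - smooth
```

After the stretch, the 8-bit version still has only 32 distinct tones spread over the whole output range - that's the posterisation you see in skies and shadows.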
 
On the resave issue, I tried it a couple of years ago. I understand (I have not tested this, so it's hearsay) that different programs may compress slightly differently; my tests used Photoshop, with maximum quality selected for each save. The eye is better at detecting differences in a side-by-side comparison, but I didn't do it that way - just open/save/open/save etc. Even so, viewing the whole image on my monitor (not at 100%), I could easily detect a very obvious inferiority after 6 saves. Viewed larger, and side by side, I would expect the difference to become obvious earlier in the process.
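The open/save/open/save experiment is easy to reproduce in code. A rough sketch in Python using Pillow and NumPy (both assumed available; the quality setting and synthetic test image are my own stand-ins, not the poster's actual files):

```python
from io import BytesIO

import numpy as np
from PIL import Image

def resave(img, quality):
    """One generation: encode to JPEG in memory, then decode again."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")

def rms_error(a, b):
    """Root-mean-square pixel difference between two same-sized images."""
    x = np.asarray(a, dtype=np.float64)
    y = np.asarray(b, dtype=np.float64)
    return float(np.sqrt(np.mean((x - y) ** 2)))

# A gradient with a little noise stands in for a real photograph.
rng = np.random.default_rng(0)
base = np.tile(np.linspace(0, 255, 256), (256, 1))
noisy = np.clip(base + rng.normal(0, 8, base.shape), 0, 255).astype(np.uint8)
original = Image.fromarray(noisy).convert("RGB")

img, errors = original, []
for generation in range(1, 7):
    img = resave(img, quality=90)
    errors.append(rms_error(original, img))
    print(f"generation {generation}: RMS error vs original = {errors[-1]:.2f}")
```

Whether the drift is visible at normal viewing size depends on the subject and the quality setting; the numbers just make the trend measurable.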

I can see the force of converting from 8 to 16 bits and editing in that (although you've already lost information in the original conversion to 8 bits), but the problem there is that I suspect most people don't do this; and I also wonder whether people come back to an image later, see other improvements they'd like to make, and reopen and re-edit it. I may not be typical, but I rarely get it in one session. I'm a believer in printing and viewing the print for a while (not on the same day as printing, but over several days) before finally deciding what I need to alter. Yes, I could convert to 16 bits at the start - but in that case I'm starting from an image that has already lost some data.

One very big caveat when considering my experience and practice is that I work with scanned medium and large format film, not a camera-produced jpg, and my prints (except for test ones, which are A4/10x8) are A3/12x16 and larger. My tests were carried out using an image from a digital camera, just for my own satisfaction, to check up on the conflicting claims I'd heard about jpg saves; and insofar as my own practice involves a series of edits over a period of time, it was a fair test (in my eyes) of what would happen if I followed my normal methods but started with a jpg.
 
There is only one significant reason to shoot JPEG these days, and that is because you don't have access to a computer (for whatever reason). JPEGs are good to go straight out of the camera, but if you do use a computer then you'll often get a better result from a Raw file (though not always), and there's no more work involved.

FWIW, I shoot Raw and process in LightRoom.

To reiterate, as I said above - the only real reason today not to shoot Raw is because you don't have access to a computer, or don't have time - eg professionals sending images from a football match straight to the picture desk.

Though it's possible to do more post-processing of a JPEG than some folks give credit for, and any loss of quality will most likely go completely unnoticed, why take the risk?

Another point about Lightroom - it's a non-destructive programme. In other words, when processing from Raw, all the edits are saved separately (in the catalogue, or an XMP sidecar file) and only applied to the image at output. The original Raw remains untouched, so if you want to re-edit the image (possibly years later) those adjustments are simply amended, and the new output is another first-generation JPEG.
 
To reiterate, as I said above - the only real reason today not to shoot Raw is because you don't have access to a computer, or don't have time -

Nonsense. Whatever happened to personal choice?
 
Nonsense. Whatever happened to personal choice?

Of course no one would say you're not free to use whichever format you want. But putting aside the need for quicker capture or quick transmission straight out of the camera, you have more options when it comes to editing - should you edit - if you use the RAW format, and are able to process the RAW files in a fairly competent way. ;)

More colours, more dynamic range, more bits to utilise should the image need them. I'd rather start with the most options possible and then choose which ones I use, rather than start off hampered. Works for me; it may not work for others. That's their choice, but at least understand the options and make an informed decision on what is right for you.
 
Nonsense. Whatever happened to personal choice?


Still alive and well :)

Richard is merely saying that there's no really good reason to shoot JPEG, not that you can't if you want to.
 
Still alive and well :)

Richard is merely saying that there's no really good reason to shoot JPEG, not that you can't if you want to.

That's not what he said; what he said is that there's no reason not to shoot raw. I'd go along with what he said, but I disagree strongly with your interpretation.
 
For me, raw almost killed the hobby. I loved going out and shooting, but the dread of coming home and having to process tons of raws bugged me to the point where I'd eventually not bother with the camera.

I changed to JPEG and, with appropriate in-camera settings, I saw no harm. If I were a pro I'd obviously use raw, but as a non-pro I found JPEG fine, and it helped restore some of my interest, as all I had to do when I came home was dump the contents of the memory card.

IMO, doing something that kills the hobby isn't worth the extra nanometre of sharpness or quarter-stop of exposure that you may or may not get from raw.

*mobile phone reply.

I've had a go at this DSLR lark twice now. Once about 7-8 years ago (give or take) and then a second time at the start of this year (no it wasn't a resolution).

The first time, I felt the same as you - apparently shooting in RAW was the only way to go and "it" killed my interest. It was clunky and took time which seemed unnecessary.

This time, it (and Lightroom) has been a revelation. It has allowed me to get things out of photos which I didn't think were possible before. When I got a shot a bit wrong, there was a chance of rescuing it. Getting it right in camera is always the aim, but an unrealistic one for a newbie to hit every time.

So what has changed? Sure, PP has moved on. Sensors have more MP. Computers are faster. But that's all meaningless detail.

The big difference this time is: I WANT to take photos. I have kids and scenery I WANT to capture. The thought of spending a bit of time tweaking those images post is a plus to me now, not a minus.

So for me, that's the nub of it. A camera, a computer etc. are but tools. The card and format are simply tools. Why we take photos, what we convey with photos and why we want to wake up at stupid o'clock is of far more importance. Let's not lose perspective: great photos can come out of every camera and truly awful photos can come out of the best Hasselblad etc.

If shooting in RAW gets you out there taking more (and better quality) photos then that's a good thing. If it's JPEG, then brilliant. Who really cares if it works for you?
 
I used to shoot nothing but raw, but now shoot raw+jpg. What made the difference? I changed from a camera that wasn't easy to set up to produce reliable jpgs (Canon 40D - probably applies to most traditional DSLRs) to one that can reliably produce excellent jpgs, with a menu system that makes selecting jpg options very straightforward and logical, as well as easily selectable user-defined jpg presets (Fuji X-Pro1). With about 80% of shots I'm now printing the jpg produced by the camera, but the raw is there for the minority that need a tweak.

And as far as I am aware, Lightroom non-destructively edits jpg as well as raw.
 
That's not what he said; what he said is that there's no reason not to shoot raw. I'd go along with what he said, but I disagree strongly with your interpretation.

What I said was, there's only one* significant reason not to shoot Raw, ie you don't have access to a computer, for whatever reason, and need finished pictures straight out of the camera. But if you do have computer access, then there are nothing but upsides to working from Raw, and no more work at all. Memory and speed are not the problems they used to be.

Picking up on some other comments above, I also used to avoid Raw because I didn't get on with all the post-processing malarkey. Lightroom changed all that, perhaps because it is very intuitive for someone coming from a conventional film/darkroom background. Fabulous programme IMHO.

*Actually there's another reason to shoot JPEG, and that's for long sequences in continuous drive mode at high fps. Most cameras fill their buffer pretty quickly shooting Raw. Maybe that's another reason why so many pros at the Olympics shot everything as JPEGs.
 
What I said was, there's only one* significant reason not to shoot Raw, ie you don't have access to a computer, for whatever reason, and need finished pictures straight out of the camera. But if you do have computer access, then there are nothing but upsides to working from Raw. Memory and speed are not the problems they used to be.

Picking up on some other comments above, I also used to avoid Raw because I didn't get on with all the post-processing malarkey. Lightroom changed all that, perhaps because it is very intuitive for someone coming from a conventional film/darkroom background. Fabulous programme IMHO.

*Actually there's another reason to shoot JPEG, and that's for long sequences in continuous drive mode at high fps. Most cameras fill their buffer pretty quickly shooting Raw. Maybe that's another reason why so many pros at the Olympics shot everything as JPEGs.

Lightroom is also the significant decider for me in the raw vs jpeg choice.

Every shot I take is put through Lightroom and exported as required with any Lightroom adjustments. These exports are mostly for the web, and Lightroom does this very well as it doesn't create any extra files.
 
I forgot one thing - the biggest drawback with JPEGs. You immediately lose about one stop of highlight headroom with in-camera JPEGs. The highlights just get chopped straight off with no possibility of recovery :thumbsdown:
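A toy illustration of the point, with made-up numbers (Python/NumPy; a real camera's tone curve is far more sophisticated than the integer division used here). Once the in-camera tone mapping pushes several distinct 12-bit highlight values onto the same 8-bit white, no amount of post-processing can tell them apart again:

```python
import numpy as np

# Three distinct bright values a 12-bit sensor might record (0..4095 scale).
raw_12bit = np.array([3000, 3800, 4095])

# Crude stand-in for an in-camera tone mapping that clips the top stop.
jpeg_8bit = np.clip(raw_12bit // 8, 0, 255).astype(np.uint8)

print(jpeg_8bit)  # all three collapse to 255 - the detail is gone

# "Recovering" highlights from the JPEG just scales the same white back up.
recovered = jpeg_8bit.astype(int) * 8
print(recovered)  # three identical values - nothing left to pull back
```

A raw converter working from the original 12-bit data still sees three different values and can tone-map them back into range; the JPEG cannot.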
 
Regardless of final output and whether you are a professional photographer or not, it's always about the image and how evocative that image is.

RAW allows far more scope to realise and express what you wish from the original picture you captured with your camera.

Nothing wrong with JPEG; it's simply about what suits you best individually - which you feel more comfortable with. For me it's exclusively RAW until the final output is saved as a JPEG.
 
RAW all the way - I've given up shooting JPG now that I've sort of tamed LR.

Here is a good example: a bit of inattention from me and a very odd light source (light diffusing through a green canvas tent!), and this was the result

5D3_0193_1000 by david.williams221162, on Flickr

A bit of attention in LR and this was the final result. I must admit that by fiddling with the jpg above I could get something like the image below, but the quality wasn't as good.

5D3_0193_1500 by david.williams221162, on Flickr

David
 
The proof of the pudding is in the eating.

So cooking from RAW potentially results in a Cordon Bleu high standard of pud. :D
 
No particular reason for me, but since all my photos end up in libraries in Apple Aperture anyway, and thus the workflow is exactly the same... why not shoot in RAW? There's just no compelling argument against it for me - no additional work - so why not utilise the proper digital negative? Heck, even when using the camera connection kit with my iPad I can see and use the photos on there...

But each to their own - I won't lose any sleep over someone else's decisions :)
 
I shoot raw for everything. There are times when I might NEED raw. There are times when JPEG would suffice, but in those circumstances raw will not be detrimental at all*, so I use raw all the time, as I can't be bothered with the faff of changing formats - just pick up the camera and get to work.



*There are some occasions when raw may be detrimental - for sports, when you want to make the most of your buffer, or events with near-instant on-site printing, for example. But I don't do that kind of work myself.
 