When the camera takes a picture, it starts with the raw sensor data. It then applies its in-camera processing, which can cause clipping, and throws a great deal of detail away in the reduction to an 8-bit file and JPEG compression. The raw data is then discarded by the camera, and you are stuck with the result: possibly blown highlights and limited scope for adjustments.
If you shoot raw, you can freely make many adjustments to the data before conversion to the final JPEG, making sure that nothing you do forces any pixels into clipping (provided they weren't clipped at the time of capture). Even if there was some clipping in one or two channels, good raw software can make an educated guess at what the data might have looked like, so long as it only has to "mend" one channel, or perhaps two. You also have far more data to play with, giving finer tonal gradation and less risk of posterisation when pushing and pulling tones. JPEG is fine if you can shoot perfectly in camera, but if you want the freedom to fix problems later, or simply to squeeze the maximum image quality from your camera, raw is the way to go.
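The posterisation point can be illustrated with a little arithmetic. This is a hypothetical sketch (the function name and the simplified linear "push" are my own, not from any camera or raw converter): brightening the shadows by two stops multiplies pixel values by 4, and in 8-bit data that leaves gaps between the surviving tonal levels, while 12-bit raw data quantised down to 8 bits afterwards still fills essentially every level.

```python
# Sketch: why a 12-bit raw file survives a shadow "push" better than an
# 8-bit JPEG. A 2-stop push multiplies values by 4; we then count how many
# distinct 8-bit output levels the darkest quarter of the tonal range uses.

def push_then_quantise(levels_in, stops, levels_out=256):
    """Simulate brightening shadow data by `stops` of exposure,
    then mapping the result down to 8-bit output levels.
    Returns the number of distinct output levels actually used."""
    gain = 2 ** stops
    used = set()
    for v in range(levels_in // 4):              # darkest quarter of the range
        pushed = min(v * gain, levels_in - 1)    # push, clipping at white
        used.add(pushed * (levels_out - 1) // (levels_in - 1))
    return len(used)

print(push_then_quantise(256, 2))    # 8-bit source  -> 64 levels (gaps: posterisation)
print(push_then_quantise(4096, 2))   # 12-bit source -> 255 levels (smooth gradation)
```

With an 8-bit source, a 2-stop push leaves only 64 of the 256 possible output levels populated, so smooth gradients turn into visible bands; the 12-bit source still covers the output range almost completely after the same push.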
Shooting raw is not about "taking photographs"; it is about "capturing data". The idea is to capture as much data as possible, which means taking a different approach to setting exposure, specifically exposing to the right. You make the photograph later, in your own good time, rather than accepting whatever conversion the camera spews out based on the settings you had at the time. If you prefer, shooting JPEG is like having a baked cake: you can't unbake it. Shooting raw is like assembling all the ingredients from which to make the cake. You do the baking later on, and if you don't like the results you can go back to the ingredients and try again. From a post by ttodd.
Basically you still convert your raw files to JPEGs, but you decide how to develop the image rather than the camera. If the JPEG is processed in camera, you have far less control over how the final image looks.