All images start as Raw: the unprocessed data straight off the sensor, adjusted for nothing except exposure level. So you can do almost anything you want with it later, within reason, although 99% of the time it will end up being output as a JPEG anyway.
When you shoot JPEG only, you decide how the Raw will be processed before you take the picture, by setting the in-camera image processing parameters (white balance, sharpness, contrast, saturation, noise reduction, etc). They are applied instantly, and the Raw is discarded. (Whether you shoot Raw or not, what you see on the LCD is a tiny JPEG processed in just the same way; it also drives the histogram/blinkies, contains the Exif data, and is tagged to the Raw for reference purposes.)
To put that another way: if you shoot Raw and then post-process it using exactly the same settings as applied in-camera, the result will be identical. In that sense, there is no advantage to shooting Raw - might as well let the camera do it. So unless you want to do something in post-processing that you can't do with the basic in-camera adjustments, there's no point in Raw. But if you make a mistake, there is less scope for changes or recovery with the JPEG: you can't get back any of the data that was discarded, if you find you need it after all.
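A rough sketch of why that recovery works from Raw but not from JPEG. This is an illustration, not any camera's actual pipeline; the gain values and the 12-bit/8-bit depths are assumptions chosen to make the point (real sensors vary):

```python
# Illustrative only: a Raw sample carries more headroom (here, 12 bits)
# than the 8-bit JPEG it is developed into. Once the JPEG clips a
# highlight to pure white, that detail is gone for good.

def develop(raw_value, exposure_gain):
    """Turn a 12-bit Raw sample (0-4095) into an 8-bit JPEG sample
    (0-255), applying an exposure gain and clipping any overflow."""
    scaled = raw_value * exposure_gain / 16  # map 12-bit range to 8-bit
    return min(255, max(0, round(scaled)))

bright_sky = 3800  # near the top of the 12-bit Raw range

# Shot too bright: the JPEG clips the sky to featureless white.
jpeg_overexposed = develop(bright_sky, exposure_gain=1.5)    # clipped to 255

# Re-developing the Raw with a lower gain keeps the highlight detail...
recovered_from_raw = develop(bright_sky, exposure_gain=0.9)  # below 255

# ...but darkening the clipped JPEG just gives a dimmer blank patch:
# every clipped pixel was 255, so they all stay identical.
darkened_jpeg = min(255, round(jpeg_overexposed * 0.9))
```

The point is the asymmetry: the Raw still knows the sky was 3800, so a gentler development recovers it, while the JPEG only knows "255" and no amount of darkening brings the detail back.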
To cover all your options, shoot Raw and JPEG together. Set up your camera pre-sets carefully, get the white balance and exposure just as you want them, and you'll probably never need to use the Raw. But it's always there, just in case. The only downside is memory card space (and sometimes processing speed), because Raw files are about four times larger than best-quality JPEGs. But at less than £20 for an 8GB memory card these days, memory is cheap.