I suppose it's easier to understand if you've ever shot film.
There really isn't a consensus on this, but thinking of it this way might help:
When light hits your sensor it creates a load of electrical signals that include far more than just the image data you'll need. Your camera can be programmed to produce a JPEG image from this data, using a set of parameters for sharpening, saturation, etc. (selected either by the camera or by the photographer). Or you can download this raw data and create the image yourself by adjusting the same kind of parameters in software.
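To make that concrete, here's a minimal sketch of the idea of "developing" linear sensor data into a finished image by applying a chosen parameter set. This is not a real camera pipeline (no demosaicing, no real colour science); the parameter names (white-balance gains, gamma, saturation) are illustrative assumptions only:

```python
# Toy "raw development" sketch: same raw data, different parameter sets,
# different finished images. Not a real camera pipeline.
import numpy as np

def develop(raw, wb_gains=(1.2, 1.0, 1.5), gamma=2.2, saturation=1.3):
    """Turn linear sensor data (H x W x 3 floats in 0..1) into an 8-bit image."""
    img = raw * np.array(wb_gains)                  # white balance (per-channel gain)
    img = np.clip(img, 0.0, 1.0) ** (1.0 / gamma)   # tone curve (simple gamma)
    gray = img.mean(axis=2, keepdims=True)
    img = gray + saturation * (img - gray)          # push colours away from grey
    return (np.clip(img, 0.0, 1.0) * 255).astype(np.uint8)

# Two "developments" of the same raw capture give two different JPEGs:
raw = np.random.default_rng(0).random((4, 4, 3))
punchy = develop(raw, saturation=1.6)
flat = develop(raw, saturation=0.8)
```

The point is that `punchy` and `flat` come from identical sensor data; the camera's in-body JPEG is just one such parameter set applied for you.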
History:
In the old days, most photographers shot print film, took it to the chemist and got a set of prints (these would all have been colour-enhanced etc. by a machine). Some photographers paid more for their prints to be assessed and for some effort to go into making sure the processor was getting the best out of the negative.
A few people shot slide film, which was chosen for its particular 'look'. It was also possible to manipulate certain colours on the film by under- or over-exposure, which led to lots and lots of bracketing and the 'best shot' being selected (for its colour profile). And still, slide film shooters pretend they're the only people who don't take advantage of processing, because the end product is SOOC (straight out of camera).
So now we've got our JPEG (or TIFF etc.). If we start to retouch it at a pixel level with cloning, smoothing, selective sharpening and so on, that's image manipulation. And in the old days it would have been done to the negative / slide / print by a skilled retoucher.
So: retouching is as old as photography itself and should be accepted as part of the 'photographic process'. The only difference in the digital world is the accessibility of the tools.
You can take it any way you like, but there's no such thing as an 'unprocessed image' that you can view. Generally, the belief that all of this is somehow 'cheating' is born of ignorance.
I hope that helps.