OK. The simple answer, which
@Alastair gave a while back, is that DPI means nothing whatsoever in an image file.
The less simple answer is that, so far as I can see, it's a bit of historical baggage. Back in the days when people were drawing up the standards for JPEG files, they thought it would be a good idea to include a field that tells a printer how big the image should be printed. If the image is 1000 pixels across, and you want it printed at 10", then set the DPI tag to 100. That sort of idea.
The thing is though, it's a bad idea. Suppose you now want to print the same image at 5". The logic says that you have to edit the image to change the DPI tag to be 200. But why? It's the same 1000 pixels. It makes no sense whatsoever to have two image files which are identical in their contents, but one prints at 10" and the other prints at 5". It makes far more sense to just ignore the DPI tag, and tell the printer how big you want the print. So that's what everybody does.
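The arithmetic here is trivial, which is rather the point. A quick sketch in Python (the numbers are the ones from above, and the function name is just mine for illustration):

```python
# Nominal print size is just pixels divided by the DPI tag.
# Same pixels, different tag value, different "print size".
def print_size_inches(pixels_across, dpi):
    return pixels_across / dpi

# The same 1000-pixel image:
print(print_size_inches(1000, 100))  # 10.0 inches
print(print_size_inches(1000, 200))  # 5.0 inches
```

Nothing about the image data changed between those two calls, which is exactly why editing the file to change the tag is pointless.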
So why does your camera set the DPI tag to 240? Well, the EXIF standard says that it has to set the tag to something, even though nobody will take any notice of it. At least it's not 72. To illustrate how thoroughly muddle-headed the standard is, it says that if the DPI tag is missing or corrupted, it should default to 72. That value was apparently chosen because, in the days these things were being decided, 72 PPI was pretty much the standard resolution for computer monitors. But if you think it through, an image file trying to tell a computer monitor how big it should be displayed makes even less sense than an image file trying to tell a printer how big it should be printed. Monitors display the pixels they're given and that's it: a monitor with a screen resolution of 72 PPI has no option but to display at 72 PPI.
I bet you wish you hadn't asked now.