ChatGPT

On LBC Radio this morning a father was relating how his 16-year-old son (now 18) had been going through all sorts of tests to determine why the lad was so tired and lacked the energy to do very much at all, suffered weight loss, had night sweats and other symptoms that I can't recall. The doctors were getting nowhere. One day he decided to gather all the blood test results and any other information whatsoever that they had been given. He went through the copies of reports that doctors and hospitals send to patients and, after feeding it all into ChatGPT, asked the very simple question: "What do you suggest is wrong with this patient, taking into consideration all these test results and reports?" (I'm paraphrasing.) It didn't take long for an answer. Leukaemia.

So... he went to the doctor and told him what he'd done, and the doctor agreed to order a test for that illness. You can guess what the result was. Leukaemia. The lad was immediately put on chemotherapy and whatever else was required. Surely it's a case of the person assessing the results having overlooked some indications that it was leukaemia? That, though, isn't the point of my post; the point is the role of ChatGPT. I've heard of it but never given it any more thought. I'll take a look.
 
Unfortunately our doctors are far too quick to dismiss you with an aspirin. It’s all about targets.

My son’s girlfriend has been seeing numerous doctors over the past 6 months about issues with her legs. Nobody has come to any conclusions as to the problem. She has been given all sorts of possible diagnoses, even a rare disease that could see her in a wheelchair for the rest of her life, which scares the crap out of her.

My son, who was more often than not with her at the appointments and has recently qualified as a personal fitness trainer, asked a few times whether the problem could be stress fractures from her road running, but was dismissed on all occasions.

Finally, a few weeks ago, they found another doctor who thought it best to take an X-ray (something none of the other doctors had bothered to do) and the diagnosis was… stress fractures.

And we wonder why our waiting lists are so bloody high. Perhaps we should just get rid of all of them and use ChatGPT.
 
It doesn't really take an AI bot; usually it's about looking at the whole person holistically, as you've described. Sadly, diagnostic skills seem to have become a thing of the past, as primary care relies on diagnostic pathways instead: flowcharts. If the path isn't documented, the illness doesn't exist. Or it could be many illnesses if you take just a few symptoms each time.
My brother-in-law complained of neck pain and loss of balance, and kept falling over. Loads of tests, and for a while they thought it was MS. Then someone bothered to point out that he was a painter and decorator who worked seven days a week and long hours, putting his kids through private schooling. So they took an X-ray of his neck and found a trapped disc between a couple of upper vertebrae, because he was always craning his neck back to paint ceilings.
 
I, in part, echo what you all say above…

I have a personal preference as to which of the GPs in the practice I make appointments with as needed... because two in particular were thoroughly underwhelming, based on some specific interactions I had with them.

One showed a complete lack of listening and curiosity. The other was in the case of steroid injections for trigger finger :(

As in all walks of life, there are doctors who are exceptional and others who lack a certain 'something'. I have interacted with both extremes, both when I worked in pathology in the 1970s and when seeing GPs and other more specialised doctors as an outpatient over the decades.
 
So, a long time ago, I studied for a degree in CompSci. One of our modules was on "Expert Systems", which are in some ways forerunners of modern AIs. They aimed to capture the experience of subject matter experts and distil it into a rule base, so that a user could enter criteria and the system would compare the pattern with what a consensus of experts would say.
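To give a flavour of the idea, here's a minimal Python sketch of a rule-based system of that sort. The rules and confidence weights below are entirely made up for illustration (nothing like a real clinical rule base, and nowhere near how MYCIN's backward-chaining actually worked): each rule says "if all these findings are present, suggest this conclusion, with this expert-supplied weight".

```python
# Toy rule-based "expert system": each rule is (required findings,
# conclusion, confidence weight). All rules here are invented examples.
RULES = [
    ({"fatigue", "weight_loss", "night_sweats"}, "consider a blood malignancy", 0.7),
    ({"leg_pain", "road_running"}, "consider stress fractures", 0.6),
    ({"neck_pain", "overhead_work"}, "consider a cervical disc problem", 0.6),
]

def diagnose(findings):
    """Return conclusions whose conditions are all present, most confident first."""
    findings = set(findings)
    matches = [(conf, concl) for cond, concl, conf in RULES if cond <= findings]
    return [concl for conf, concl in sorted(matches, reverse=True)]

print(diagnose(["fatigue", "night_sweats", "weight_loss"]))
```

A real expert system would chain rules together, ask follow-up questions, and explain its reasoning, but the core idea is the same: encode the experts' pattern-matching as explicit rules.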

One of the most famous was called MYCIN, which diagnosed post-operative infection. You can read all about it on Wikipedia. It was roughly 65% accurate in diagnosing and prescribing meds. That sounds terrible until you read that the experts at the time were about 50% accurate. It was about as good as the best experts in the world and significantly better than a GP, in a field where better = fewer deaths.

However, after learning all about this exciting "new" technology, we were told the bombshell: it was never used. Not because it didn't work, but because people didn't trust it. They expected an AI to be 100% accurate, even though no human could ever approach that.

How times have changed.
 
I think the nub of the issue is that it is fine to use AI, or an expert system, or even a pathway, but retain the human ability to weigh the suggestions as part of the diagnostic process, not just accept the given answer unquestioningly.
 
I think the nub of the issue is that it is fine to use AI, or an expert system, or even a pathway, but retain the human ability to weigh the suggestions as part of the diagnostic process, not just accept the given answer unquestioningly.
And yet we unquestioningly accept the deliberation of an expert human without understanding the process.

House is wrong at least 5 times as often as he's right.
 
Is it possible that the emphasis in medical training has shifted from "looking and listening" to "fixing and forgetting"?

Older doctors have always seemed to me to be slower to make a diagnosis and more likely to say "try this and we'll see how it goes" whereas younger doctors tend to make diagnoses more quickly and then get on with the next thing.

I write purely as a patient.
 
I think it's akin to the SpaceX and IT approach of "fix in production" - try this pill, if it doesn't work we can try another. Each time you go away with a solution, even if it may not work, so the stats look good.
 
I think it's akin to the SpaceX and IT approach of "fix in production" - try this pill, if it doesn't work we can try another. Each time you go away with a solution, even if it may not work, so the stats look good.
That was very much the case in getting BP medication 'right for me'.
 
I think the nub of the issue is that it is fine to use AI, or an expert system, or even a pathway, but retain the human ability to weigh the suggestions as part of the diagnostic process, not just accept the given answer unquestioningly.
Yes, I often hear folk say how their SatNav makes odd decisions sometimes… my approach re: SatNav is that I have in mind a general routing that makes sense, which gives me the perspective to judge how beneficial the SatNav can be! In other words, don't be a slave to the technology ;)
 