Just an afterthought after reading articles about AI doing better than doctors at diagnosing cases.
"ChatGPT beat doctors at diagnosing diseases" is one of those eye-opening headlines that hints at a possible future a few decades from now. Remember the early 1990s, just before the internet was a thing? Life was much different back then, until the tech blew up and information became more accessible than ever.
Access to reference textbooks was expensive, and so was education, and this left a void in skilled manpower. AI fills that knowledge gap since it's been trained to explain concepts like you're 5 years old at times.
Patients would often come in with a diagnosis of their case already in mind because they're Google MDs, but now it's going to be a wave of ChatGPT MD users coming in for consults, and they may well hit the mark more often. I'm not doubting that this tech will change the way healthcare is handled, but it still leaves us open to human error if the human isn't competent enough to know what they should be prompting for.
As a physician teaching interns in the clinics, I see how reliant the new generation is on Google searches and AI rather than coming in to duty prepared with a good foundation from the classroom. Even with their gadgets, they still fail to get some points right, because there's still an error in how they search for answers and make sense of the information presented to them.
The shit test I often do with interns is asking why they're giving me these answers; even if I know they're right, it doesn't mean they understood how right they are. Say I ask why they take the blood pressure: they report the possible deranged values and the significance of normal values in relation to the case, but sometimes they can't connect why they got those results. They just know, but never ponder why the answers come out the way they do.
If someone tells me they have a raised blood pressure right now, I've already come up with a mental bucket list of things that could explain why, but I'll further ask which applies to the case and keep that inner voice in my head saying, am I really right, or did I miss anything? I don't get that energy from the younger generation of interns. It just seems like people are content with the convenience they're given, without the skill set to validate it or the awareness that the tech they rely on can at times be unreliable.
I'm not really concerned about AI replacing doctors, since the tech is a supplement and not a substitute. The tech relies on prompts: right and accurate prompts lead to better information being relayed back to the one asking, but if the one asking doesn't even know what they're looking for, that could set off a dangerous chain of events in healthcare. I've personally used ChatGPT just to knowledge-check what I already know and what I don't.
Thanks for your time.