Tuesday, June 11, 2024

Artificial Intelligence shouldn't make the final decision in urgent situations

A.I. (Artificial Intelligence) has been a major part of the conversation in recent years.


A.I. can analyze data faster than humans can, and because of that, it could threaten jobs.


(Note how many times people told me that A.I. would make my library career goals obsolete).


But A.I. isn't a God. It's not even human.


A.I. is just a mish-mash of data inputted by some humans. It doesn't record all of human knowledge. 


And more importantly, A.I. doesn't even have empathy. 


Empathy is something that grows through experiencing emotions, whether by living through situations yourself or by learning about other people's situations.


And when it comes to life-or-death situations, empathy has to be part of the equation.


When it comes to life-or-death situations, it should be human intelligence, not artificial intelligence, that makes the final decision. 

 

The following is from a recent article by a medical doctor.


Dr. Eric Snoey, “Wait Times Go Down. Patient Satisfaction Goes Up. What’s the Matter with Letting Apps and AI Run the ER?,” MSN, May 19, 2024, https://www.msn.com/en-us/health/other/opinion-wait-times-go-down-patient-satisfaction-goes-up-whats-the-matter-with-letting-apps-and-ai-run-the-er/ar-BB1mEHxn


The basic problem with hospitals’ growing obsession with efficiency is this: Algorithmic systems treat all patients the same, expecting precise, like-for-like responses to every question with just the right amount of detail. Except every patient is unique. And they tend to give up their stories at their own pace, in broken, non-linear fits and starts, sometimes conflating truth and fiction in ways that can be counterproductive and frustrating, but also uniquely human. I am often reminded of Jack Webb in the old TV series “Dragnet” imploring a witness to offer “just the facts, ma’am, just the facts.” In real life, whether from situational stress, self-delusion, superstition, health illiteracy, mental illness, drugs or alcohol, my patients' initial version of their complaint is rarely "just the facts” or the final word on the subject.


The doctor later described his experience with a patient while working alongside a resident who had used A.I. to analyze the patient's data:


And so, I complimented my resident on her list of concerns but suggested that we spend a little more time with the patient. The story of her symptoms didn’t feel complete. I recommended my resident grab a chair and simply ask the patient about her life. What emerged was the chaotic picture of an exhausted part-time student by day, working two evening waitressing jobs and surviving on pizza, pasta and energy drinks. She had always had a "fragile stomach."
Our list of reasonable diagnoses was expanding and contracting, replaced with irritable bowel syndrome, food intolerances, gut motility issues, all overlying a stressed individual barely keeping it together. The labs, ultrasound or CT scan initially proposed now seemed irrelevant.
The result: The patient got out of the hospital faster. She received helpful suggestions about stress reduction, diet and sleep habits. She got an appointment with a primary care physician and avoided thousands of dollars in tests. Had we just relied on tests instead of asking a few more questions, there is a good chance we would have missed the best approach to her problem entirely.


Basically, the doctor asked the patient about her life situation and used that information to figure out how best to help her.


That's how I want to be treated when I receive medical care!

--------------------------------


As for my writings, I could put out more content if I used ChatGPT.

But that wouldn't be honest. 

The whole point of writing my blog is to share MY perspective. 

MY perspective. 

Not a machine's perspective. 


I approach writing as sharing my unique individualized thought process. 


There's nothing unique nor individualized about using an A.I. program to spit out content.


Yes, I use spell check and Grammarly to edit what I'm writing. I also use a citation generator to help me properly note my information sources.


But that's all the A.I. I use in writing my blog posts.


I wish Grammarly had been around when I started blogging; it might've saved me from embarrassing grammar errors that I only found when rereading some of my posts years later.


While Grammarly can give me suggestions for improving my blog post's grammar, I can ignore them when I want.  


I make the final decision about what appears on my blog.


After all, it's my name on the blog.