ChatGPT Gives Quality Information, More Empathetic Responses Than Human Doctors in Medical Questions
(Photo: Pexels/Matheus Bertelli)

ChatGPT outperformed human doctors in answering medical questions, with a panel of physicians preferring its responses. The new study suggests that doctors could use ChatGPT to give patients quick, highly informative answers to their queries.

ChatGPT Better Than Human Doctors?

A recent study led by researchers at the University of California compared the performance of the AI chatbot ChatGPT with that of physicians in answering 195 medical questions, assessing which gave the more informative and compassionate responses, ScienceAlert reported.

Researchers retrieved the queries from the public "AskDocs" subreddit. For instance, one user asked how dangerous it is to swallow a toothpick. Another asked whether they could get a concussion after hitting their head on a metal bar.

Each query on the subreddit had already been answered by a verified healthcare professional whose credentials were confirmed by a moderator. The researchers then ran the same queries through ChatGPT to generate a second set of responses.

A panel of healthcare professionals was asked which response, the chatbot's or the physician's, was better. However, the evaluators were not told which answer came from which source.

Each case was evaluated by three different judges, whose scores were averaged, yielding a total of 585 evaluations.

The judges favored the chatbot's responses in 79 percent of cases, rating them as higher in quality and more empathetic in tone than the physicians' answers.

The chatbot's responses were also about four times longer, averaging 211 words per post compared with the physicians' 52 words, and they were rated as empathetic roughly ten times as often as the doctors' replies.
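For readers who want to see where the headline figures come from, the totals and ratios reported above can be checked with simple arithmetic. The sketch below uses only the numbers quoted in this article (195 questions, three judges per case, 211 versus 52 average words); it is illustrative and does not recompute anything from the study's raw data.

```python
# Quick arithmetic check of the figures quoted in this article.
# All inputs are the values as reported above, not recomputed from the study's data.

questions = 195          # medical questions drawn from the "AskDocs" subreddit
judges_per_case = 3      # each case was scored by three judges

total_evaluations = questions * judges_per_case
print(total_evaluations)                                  # 585, matching the reported total

chatbot_avg_words = 211
physician_avg_words = 52
print(round(chatbot_avg_words / physician_avg_words, 2))  # ~4.06, i.e. about four times longer
```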

According to the study, a chatbot may be more effective than a busy doctor volunteering to answer questions online, conveying something closer to a caring bedside manner.


Doctors Can Use ChatGPT to Answer Patient Questions

Since the pandemic made telemedicine popular, doctors have been flooded with patient messages, creating an urgent demand for tools that increase efficiency and improve the quality of care. For instance, a chatbot could draft answers to patient inquiries that a doctor then reviews and edits.

According to the researchers, the study should encourage further research into using AI assistants for patient communication. If more patient queries were answered promptly, empathetically, and to a high standard, there might be fewer unnecessary clinical visits, freeing up resources for patients who need them.

According to Anthony Cohn, a professor of automated reasoning at the University of Leeds in the UK, it would be risky to rely on any factual information in a chatbot's response, given the tendency of chatbots to "hallucinate" and make up facts. He therefore recommends that an actual doctor check any chatbot-drafted response before it is sent.

In a previous report from Science Times, doctors discouraged people from seeking medical advice from ChatGPT because it tends to fabricate health statistics when asked about cancer. An earlier study found that the AI chatbot got only one correct answer out of 10 inquiries about breast cancer, and even the correct responses were not as "complete" as those found through Google searches.

Dr. Paul Yi, a co-author of the breast cancer study, encouraged the public to rely on their doctors rather than ChatGPT for medical advice. He added that the AI chatbot tends to make up fake articles to support its claims.

The study was published in JAMA Internal Medicine.

