The AI doctor won't see you now as study finds chatbots give inaccurate drug advice

Patients have been warned not to turn to AI chatbots and search engines for advice about medicines, after a study found the answers they give are often wrong.

Researchers asked Bing Copilot, the AI chatbot developed by Microsoft, 500 questions about the 50 most commonly prescribed drugs in the US.

These included queries about what the medicines were used for, how they worked, usage instructions and common side effects.

Although the chatbot often provided complete and accurate information, a worrying number of its answers were deemed incorrect or potentially harmful.

The researchers said: "We observed that search engines with an AI-powered chatbot produced overall complete and accurate answers to patient questions.

"However, chatbot answers were largely difficult to read and answers repeatedly lacked information or showed inaccuracies possibly threatening patient and medication safety."

Detailed analysis of 20 answers by a panel of seven drug safety experts found that only 54 per cent aligned with scientific consensus. Some 39 per cent contradicted that consensus, and there was no established consensus for the remaining 6 per cent.

Possible harm resulting from a patient following the chatbot's advice was rated highly likely for 3 per cent of those answers and moderately likely for 29 per cent.

The researchers, from the University of Erlangen-Nuremberg in Germany, also noted that the chatbot's answers were often complex and would require a degree-level education to understand.

Their report in the BMJ Quality & Safety journal said: "Despite their potential, it is still crucial for patients to consult their healthcare professionals, as chatbots may not always generate error-free information.

"Caution is advised in recommending AI-powered search engines until citation engines with higher accuracy rates are available."

Chatbots are based on large language models, which are trained on extensive datasets drawn from across the internet, enabling them to generate responses on any topic, including healthcare.

The Bing chatbot cited more than 200 websites as sources of information, most frequently referencing drugs.com, mayoclinic.org and healthline.com.

A Microsoft spokesperson said: "Copilot answers complex questions by distilling information from multiple sources into a single response.

"Copilot provides linked citations to these answers so the user can further explore and research as they would with traditional search.

"For questions related to medical advice, we always recommend consulting with a healthcare professional."
