March 12, 2026

Oxford University Study Reveals ChatGPT Inaccurate for Medical Advice

The CSR Journal Magazine

A study from the University of Oxford has indicated that relying on chatbots for medical advice may lead to inaccurate health recommendations. The research, published in the journal Nature Medicine, involved nearly 1,300 individuals who were asked to identify health conditions and recommend a course of action for specific medical scenarios. The results highlighted significant gaps in the interaction between users and AI. Notably, the quality of answers generated by AI was no better than information obtained through standard internet searches.

How the Research Was Conducted

The researchers divided participants into two groups, each receiving detailed medical scenarios. Examples included situations involving a young man experiencing severe headaches after a night out and a new mother feeling persistently out of breath. One group interacted with AI to assess these scenarios, while the other was allowed to use any method, such as internet searches. Evaluations were conducted to determine how accurately participants identified potential health issues. The findings showed that those who engaged with chatbots did not demonstrate improved decision-making compared to those who relied on traditional methods. AI responses varied significantly based on how questions were framed, resulting in a mix of accurate and misleading information.

The Importance of Rigorous Testing

Study author Adam Mahdi, an associate professor at the Oxford Internet Institute, emphasized the necessity for thorough testing of AI systems in healthcare, akin to clinical trials for medications. Despite the advancements in AI, these systems require comprehensive evaluations with a diverse range of users to fully understand their reliability in critical healthcare settings.

Interaction Challenges with AI Systems

The research indicated that participants often did not know which specific information AI systems need in order to provide appropriate advice. In traditional medical consultations, doctors gather crucial details by asking targeted questions. This back-and-forth is frequently absent when patients turn to chatbots for initial guidance. Dr. Pakhhe Aggarwal, a gynaecological oncologist at Apollo Hospitals in New Delhi, described a case in which a patient first consulted ChatGPT about suspected endometriosis. She noted that the chatbot had suggested such a diagnosis from merely a few symptoms, demonstrating the limitations of AI in providing accurate health assessments.

Limitations in Patient Interaction

Dr. Aggarwal pointed out that AI’s capabilities are insufficient for delivering definitive diagnoses based solely on limited symptoms. Physicians consider a patient’s medical history, specific symptoms, and follow-up questions to arrive at an accurate diagnosis. Unlike in real doctor-patient interactions, users of chatbots choose for themselves what information to disclose. Many do not understand the importance of providing comprehensive details, leading to misguidance from AI systems. In one example from the study, a participant described a scenario involving gallstones without including critical details such as pain location, severity, and frequency, and the chatbot could not offer an accurate diagnosis.

Dangers of Misdiagnosis from AI

The interactions between users and AI present significant challenges. The lead author, Andrew Bean, expressed hope that this study would aid in the development of safer and more effective AI tools. It is essential for patients to recognize that consulting a large language model like ChatGPT concerning their symptoms may be hazardous, as it could lead to incorrect diagnoses and fail to identify situations where urgent medical attention is necessary. Ultimately, the study concluded that none of the evaluated language models is yet suitable for direct patient care.
