Poster
PI-036
Developing a Responsible AI-Powered Clinical Intelligence Solution in Wound Care
Introduction:
We are now in the third epoch of artificial intelligence (AI). This era of generative AI introduces new capabilities and risks, such as hallucinations, where AI generates incorrect or fabricated information (1). Generative AI alone should not be used for individual diagnoses or treatment plans due to potential patient harm and increased malpractice liability. In clinical decision-making, general-purpose AI chatbots achieve only about a 50% success rate, often missing life-threatening conditions (2,3). To address these issues and help clinicians find reliable answers in wound care, we aimed to develop a responsible AI-powered clinical intelligence solution.
Methods:
Using the Design Thinking methodology(4), the solution was developed as a module within a decision support platform with a continually updated, evidence-based knowledge base (KB).
● Four AI models were tested: one open (unconstrained) AI model and three programmed to retrieve, aggregate, and generate answers strictly from the KB.
● Four wound clinicians compiled 50 frequently-asked questions and manually extracted answers from the KB to create a benchmark.
● Answers to the 50 questions provided by each AI model were evaluated against the benchmark, and accuracy scores were calculated.
● The two AI models with the highest accuracy scores were selected and calibrated until one achieved a 100% accuracy score, which was then incorporated into the solution.
Results:
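The benchmark evaluation in the steps above can be sketched as follows. The scoring rule shown (an answer is correct only if it contains every clinician-extracted key point) and all data are illustrative assumptions, not the authors' actual metric or questions.

```python
def accuracy_score(model_answers, benchmark):
    """Score a model's answers against a clinician-curated benchmark.

    model_answers: dict mapping question -> model's answer text
    benchmark: dict mapping question -> set of required key points
    An answer counts as correct only if it contains every key point.
    """
    correct = 0
    for question, key_points in benchmark.items():
        answer = model_answers.get(question, "").lower()
        if all(point.lower() in answer for point in key_points):
            correct += 1
    return correct / len(benchmark)

# Hypothetical two-question benchmark:
benchmark = {
    "How often should a wound dressing be changed?": {"per manufacturer", "exudate"},
    "What indicates wound infection?": {"erythema", "warmth", "purulent"},
}
answers = {
    "How often should a wound dressing be changed?":
        "Change per manufacturer guidance and whenever exudate strikes through.",
    "What indicates wound infection?": "Increasing erythema and warmth.",
}
print(accuracy_score(answers, benchmark))  # 0.5 (second answer misses "purulent")
```

In the study, each of the four models would be scored this way over the 50 questions, and the two highest-scoring models carried forward for calibration.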
The module is a mobile-responsive solution that leverages AI-powered search and generative answers that are vetted, constantly updated, evidence-based, and reliable. Clinicians can quickly find trusted answers by selecting frequently-asked questions from the prompt library. The AI model retrieves and aggregates data from the KB to generate answers. For details, clinicians can refer to the sources used by the AI model via provided links or ask follow-up questions. The prompt library expands as new questions are received. Unlike open AI models, this solution curbs AI hallucinations, providing referenced answers vetted by wound clinicians, built on an evidence-based, continually updated KB.
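The retrieve-aggregate-generate flow with source links described above can be sketched as a minimal knowledge-base-grounded pipeline. The KB structure, the keyword retrieval, and the example URLs are illustrative assumptions; a production system would use semantic search and a generative model constrained to the retrieved passages.

```python
# Minimal sketch of KB-grounded answering: retrieve matching KB entries,
# answer only from those entries, and return source links for verification.
KB = [
    {"text": "Debridement removes nonviable tissue to support healing.",
     "link": "https://example.org/kb/debridement"},
    {"text": "Compression therapy is first-line for venous leg ulcers.",
     "link": "https://example.org/kb/compression"},
]

def retrieve(question, kb):
    """Return KB entries sharing at least one keyword with the question."""
    words = {w.strip("?.,").lower() for w in question.split()}
    return [e for e in kb
            if words & {w.strip("?.,").lower() for w in e["text"].split()}]

def answer(question, kb):
    """Answer strictly from retrieved KB entries; refuse when nothing matches."""
    hits = retrieve(question, kb)
    if not hits:  # refusing is what curbs hallucination
        return "No vetted answer available in the knowledge base.", []
    text = " ".join(e["text"] for e in hits)
    return text, [e["link"] for e in hits]

text, sources = answer("What is first-line therapy for venous ulcers?", KB)
```

The key design choice is that the generation step never draws on the model's open-ended knowledge: if the KB contains no relevant entry, the system declines to answer rather than fabricating one.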
Discussion:
A responsible AI-powered clinical intelligence solution for wound care was successfully developed. This solution enhances clinical decision-making by delivering reliable information at the point of care. Generative AI should supplement, not replace, human judgment, and requires responsible use and robust oversight to complement clinical expertise.
References
1. Howell MD, Corrado GS, DeSalvo KB. Three epochs of artificial intelligence in health care. JAMA. 2024 Jan 16;331(3):242–4.
2. I’m an ER doctor, here’s how I’m using ChatGPT to help treat patients [Internet]. [cited 2023 Jun 4]. Available from: https://www.fastcompany.com/90895618/how-a-doctor-uses-chat-gpt-to-treat-patients
3. ChatGPT in the emergency room? The AI software doesn’t stack up [Internet]. [cited 2024 May 31]. Available from: https://www.fastcompany.com/90863983/chatgpt-medical-diagnosis-emergency-room
4. Ferreira FK, Song EH, Gomes H, Garcia EB, Ferreira LM. New mindset in scientific method in the health field: Design Thinking. Clinics. 2015 Dec 10;70(12):770–2.