Digital Education Studio

Can we use Generative AI to support the clinician through training and in the real world?

Stuart Miller, Senior Lecturer in Sports and Exercise Medicine, William Harvey Research Institute and Ian Griffiths, Lecturer in Sports & Exercise Medicine, Institute of Health Sciences Education

Stuart and Ian share their journey with using generative AI for clinical teaching, from curiosity to ideation and implementation, across two projects, one of them PPFEE funded.

 

Like many people, we were both excited by the recent availability of ChatGPT. We had both used AI-related tools in our academic and personal lives, but this felt like an opportunity at a different level. Generative AI was now more accessible, and we were excited about how we could use it in our academic practice.

We could sense everyone’s apprehension as people repeated, “This will increase the rate of students cheating”. Academic integrity is a valid concern, so we tried to see how good ChatGPT is at marking and providing feedback on short answer questions; obviously we didn’t use it to mark our actual exams. Giving ChatGPT the exam questions and marking criteria, then asking it to mark and feed back on some example answers, was both scary and exciting – it was accurate, and the feedback was really good!
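For anyone curious what that looked like in practice, the sketch below is a minimal illustration of the workflow using OpenAI’s Python client. The model name, exam question, marking criteria and student answer are all placeholders rather than our actual exam material.

```python
# Illustrative sketch only: the model name and all exam content are placeholders.
# Requires the openai package and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

question = "Describe the typical mechanism of a non-contact ACL injury."          # placeholder
criteria = "1 mark: planted foot with pivot; 1 mark: valgus/rotational load."     # placeholder
answer = "It usually happens when the knee twists while the foot stays planted."  # placeholder

response = client.chat.completions.create(
    model="gpt-4o",  # any capable chat model
    messages=[
        {"role": "system",
         "content": "You are an examiner. Mark the student's answer against the "
                    "marking criteria and give short, constructive feedback."},
        {"role": "user",
         "content": f"Question: {question}\nMarking criteria: {criteria}\n"
                    f"Student answer: {answer}"},
    ],
)

print(response.choices[0].message.content)  # the model's mark and feedback
```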

That concern didn’t deter us from exploring the potential applications of tools like ChatGPT further. Over WhatsApp and e-mail, we started to share the prompts and outputs we had tried, and ideas for more formalised projects, two of which have begun this year.

Supporting the clinician

People have regularly enlisted the guidance of “Dr Google” for advice about their condition or symptoms. Regularly, this ends in anecdotal suggestions based on no evidence, or in exaggerating the extent of the condition – focusing on the most unlikely, but most severe, outcome.

We will naturally flock to generative AI large language models, like ChatGPT and Google Bard (now Gemini), to replicate the “Dr Google” approach – but should clinicians view them with the same fear, or can they be a useful tool to support patients?

Some of our MSc students are exploring this idea as part of their dissertations, researching the extent to which tools like ChatGPT can be used to advise people outside of the clinical setting. How does the information they provide compare to clinicians’ in terms of accuracy and empathy? How does it match up with current best evidence-based practice?

Despite the project’s real-world, patient-supporting theme, there is a strong pedagogical reason underpinning it. It goes back to the apprehension and fear we were hearing when ChatGPT first became well known. We are eager to see how well these tools provide understanding and knowledge on specific clinical topics.

Can students use these tools to “cheat” on assessments, or do they lack the detail, reasoning, and empathy that we challenge our students to master? A “no” would make this easier to handle. But it’s the potential “yes” that we are more excited about.

A tool that students can interact with away from class and their peers, one that provides a more communicative and interactive environment while delivering the high level of information and guidance needed, is something to be excited about. It’s this outcome that opens up a whole new level of opportunity: a learning environment that can be tailored to the individual student.

One such possibility is the parallel project we are working on…

GenAIrating critical thinking

OSCEs - Objective Structured Clinical Examinations - are a core assessment strategy across medical domains. The opportunity to demonstrate competency in communication and clinical reasoning is central to them. But outside of the supervised setting, it’s very hard to really practice these skills.

A tool that gives students the ability to practice these interactions, where they are challenged to think critically and systematically and to communicate openly with empathy, would be indispensable. Being able to then reflect on the process, exploring the journey the interaction took, would be priceless.

We decided to use Google Bard this time, as we had already used ChatGPT and thought a change would be good. “I'm a medical student specialising within sport and exercise medicine. I want to practice a consultation with a patient. You are to be the patient. You have a musculoskeletal injury, and this is your first appointment. You are to take on the role of the patient, and I will be the clinician. We will have a conversation as if we are at your first appointment. Do you understand?”

It was not a very well-thought-out prompt, but we just went with it. We ended up chatting with a young individual who had recently been having pain in their right knee. Without any guidance from us as to the injury or mechanism of injury, what came back was a “consultation” that reflected what we regularly see in the clinical setting. The “patient” (they called themselves Bard) was like any real-world patient: they demonstrated apprehension and frustration, even catastrophising at some points, and challenged us to be empathetic whilst focusing on the clarity of our message.
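Our experiment was simply typed into Bard’s web interface, but the same role-play can be scripted. The sketch below is a rough illustration using Google’s generative AI Python client (google-generativeai); the model name and API key are placeholders, and the set-up prompt is a shortened version of the one above.

```python
# Rough sketch of the simulated-patient role-play, scripted rather than typed
# into the web interface. Model name and API key are placeholders.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")            # placeholder
model = genai.GenerativeModel("gemini-1.5-flash")  # any chat-capable model
chat = model.start_chat()

# The same kind of set-up prompt, sent as the opening message.
setup = (
    "I'm a medical student specialising in sport and exercise medicine and want to "
    "practice a consultation. You are the patient: you have a musculoskeletal injury "
    "and this is your first appointment. I will be the clinician. Do you understand?"
)
print("Patient:", chat.send_message(setup).text)

# Simple console loop: the student types as the clinician, the model replies as the patient.
while True:
    turn = input("Clinician: ")
    if turn.lower() in {"quit", "exit"}:
        break
    print("Patient:", chat.send_message(turn).text)
```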

What also came out of it was a transcript of the consultation. Imagine a student being able to read back through it, discuss it with other students, even show it to their tutor, to support self-reflection and critical appraisal of how they performed. Now imagine this progressing beyond text through voice input and output. Imagine if it were free and accessible to all. We now have the tools that give us the potential to do this.
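As a small extension of the sketch above, the chat history the client keeps is exactly that transcript, and it can be written to a file for later reflection (attribute names as in google-generativeai; the filename is just an example):

```python
# Save the role-play as a readable transcript for later reflection.
# chat.history holds alternating "user" (clinician) and "model" (patient) turns.
with open("consultation_transcript.txt", "w", encoding="utf-8") as f:
    for message in chat.history:
        speaker = "Clinician" if message.role == "user" else "Patient"
        f.write(f"{speaker}: {message.parts[0].text}\n\n")
```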

We have since gained PPFEE funding to explore this route further, alongside a student steering group. We are currently focusing on clinical settings seen within the Sport and Exercise Medicine domain – simply because this is the domain we work in. The potential for extrapolating what we develop to other domains is easy to see.

Any setting where a student, in their future profession, would interact with or consult another human being whilst thinking critically and systematically could be recreated, providing potentially endless opportunities to practice and develop critical thinking and reflection.
