How a Geisel Professor and Libraries Data Scientist Collaborated to Deepen Learning

November 17, 2023
From left: Associate Professor of Medical Education Thomas Thesen and Dartmouth Libraries Data Scientist Simon Stone

The Challenge: leveraging ChatGPT to give students a better learning experience while improving patient outcomes

Who from history would you chat with? Benjamin Franklin? Maya Angelou? Plato? This seemingly far-fetched concept is closer to reality than you may realize.

Helping make that possibility a reality is a group of collaborators from Hanover to Nairobi, including Geisel Associate Professor of Medical Education Thomas Thesen and Dartmouth Libraries Data Scientist Simon Stone.

The ever-evolving usability of generative artificial intelligence had Thomas wondering, "How can we leverage ChatGPT for our students to have a better learning experience while ultimately improving patient outcomes?" Answering this question presented a profound and beneficial shift in his curriculum and, thus, in his students' learning outcomes. 

Our entire society benefits from having better doctors; if this app can help even one person become a better healthcare provider, then it's worth it. - Nsomma Alilonu, Dartmouth Geisel School of Medicine student

Though Thomas had a concept in mind for how it would work, he needed someone to write the code and train the AI. While chatting at a DCAL event, Thomas mentioned his idea to Simon. They determined how to design the concept into a stable and deployable app with all the "bells and whistles."

Simon, who works in the Dartmouth Libraries' Research Data Services team, regularly collaborates with ITC's Research Computing group to address emerging uses of AI across Dartmouth. So, he knew just how to turn this concept into reality.

Thomas mentioned this proof of concept, or prototype, and I got stars in my eyes! His idea is exactly what this technology should do. It gives us a way to engage with students that was impossible before. It was so nice to see a faculty member go beyond thinking about preventing disruption and instead create something new using the technology. - Simon

"MedSimAI" mimics the social environment and conversational flow in which patients and medical professionals would realistically find themselves. 

The prototype simulates a patient-physician interaction, with the student acting as the medical professional and ChatGPT as the patient, with the ability to pull medical history, different tests, and other activities that would occur in a real-world situation. - Thomas 

Bringing Thomas' idea to its current beta version wouldn't have been possible without the support of additional co-designers and collaborators, including Jonathan Crossett from ITC and Nsomma Alilonu, a student at Dartmouth Geisel School of Medicine. Nsomma shared:

It's been a very enlightening experience. I majored in Computer Science during undergrad, so I know what it's like to work with natural language processing models, but I've never seen one as capable as GPT-4 before. It's able to consistently answer questions, give feedback that makes sense, and talk much in the same way as a real person would without getting distracted or going off-topic.

Her role thus far has included working with Thomas on the experimental design, creating a feedback rubric based on what is taught during students' pre-clinical education, and sharing a student's perspective on the app's functionality. 

Thomas and Simon's ultimate goal is to scale the app and make it as openly accessible and equitable as possible. Their "fruitful collaboration," as Thomas calls it, isn't just for Geisel medical students; it includes a partnership with a Kenyan medical school in Nairobi, Aga Khan University Medical School - East Africa.

"This core idea of the large language model that acts as a player/actor so students can interact with the AI – this is the most transferable concept that can be applied to other departments across Dartmouth." - Simon

So, how does it work? 

Thomas creates case descriptions and feeds those into the app, including information about all the necessary tests that would arise in a real-world situation, like ordering blood work. Depending on the disease or diagnosis, the AI-as-patient (chatbot) could have three or four different medical issues, so the student-as-physician converses with the chatbot to determine the diagnosis.

At the end of the simulated medical interview, the student submits what they believe is the diagnosis. The app analyzes the conversation and the student's diagnosis against a pre-designed rubric. The student receives individualized feedback - including measuring the human side of the process, such as how well the student displayed empathy. 
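The workflow described above, an instructor-authored case, a simulated interview, and rubric-based feedback on the submitted diagnosis, could be sketched roughly as follows. All names here (`PatientCase`, `patient_reply`, `score_interview`, the keyword rubric) are hypothetical illustrations, not the app's actual code; a real implementation would replace the stubbed `patient_reply` with a call to a hosted large language model, and its feedback rubric would be far richer than keyword matching.

```python
from dataclasses import dataclass

@dataclass
class PatientCase:
    description: str       # instructor-authored case fed into the app
    true_diagnosis: str
    rubric: dict           # criterion -> keywords the student should cover

def patient_reply(case: PatientCase, question: str) -> str:
    # Stand-in for an LLM call: the real app would send the case description
    # as a system prompt and the student's question as a user turn.
    return f"(patient, per case '{case.description}') responding to: {question}"

def score_interview(case: PatientCase, transcript, submitted_diagnosis: str) -> dict:
    # Analyze the conversation and diagnosis against the pre-designed rubric.
    asked = " ".join(question.lower() for question, _ in transcript)
    feedback = {
        criterion: any(keyword in asked for keyword in keywords)
        for criterion, keywords in case.rubric.items()
    }
    feedback["correct_diagnosis"] = (
        submitted_diagnosis.lower() == case.true_diagnosis.lower()
    )
    return feedback

case = PatientCase(
    description="58-year-old with sudden unilateral weakness",
    true_diagnosis="ischemic stroke",
    rubric={
        "onset": ["when", "start", "onset"],
        "history": ["history", "medication"],
        "empathy": ["sorry", "understand", "worried"],  # the "human side"
    },
)

# The student-as-physician converses with the AI-as-patient...
transcript = []
for question in [
    "When did the weakness start?",
    "I understand this is frightening. Any relevant medical history?",
]:
    transcript.append((question, patient_reply(case, question)))

# ...then submits a diagnosis and receives individualized feedback.
print(score_interview(case, transcript, "Ischemic stroke"))
```

Even this toy version shows why the approach scales: the conversation loop and the grading step are independent, so instructors can add new cases and rubrics without touching the simulation code.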

The app isn't a standalone solution for teaching students this vital aspect of medical care. It's a complementary tool in their learning that provides extensive feedback.

I believe that the patient interview is one of the most important skills that we learn in our medical education: if you don't ask the right questions in the right way, you can give the wrong diagnosis, waste a patient's time and money, and create distrust.

While I don't think all the nuances of talking to a real person can be perfectly captured using AI, consistent practice with a tool that is thoughtfully designed to make the experience as close to the real one as possible may be one way to build the habits necessary to take a good patient history. Our entire society benefits from having better doctors; if this app can help even one person become a better healthcare provider, then it's worth it. - Nsomma

When asked, "What's next?" Thomas said that they're still learning from student usage and testing what goes right and wrong. He hopes to have board-certified neurologists contribute to the assessment process. They're also about to launch a speech-to-text capability that lets the student speak to the app, and they are currently testing text-to-speech, in which ChatGPT responds with audible speech. The text-to-speech voice they're using sounds natural, though in a somewhat neutral, narrator-esque style, so the result can be a bit uncanny as the "patient" describes the symptoms of their ailments.

Development and testing are underway to integrate images and other records into the app as an extra layer to the student learning experience. For example, if the case file describes a dermatologic issue, there will be a cache of images related to that issue. The app then presents the image when the student-as-physician asks for it. These enhancements broaden the scope of learning and elevate medical students' education.
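The image-integration idea described above, a per-case cache of media that the app reveals only when the student asks for it, might look something like this minimal sketch. The function name, cache structure, and filenames are all hypothetical, and a real version would match requests far more robustly than by keyword.

```python
def requested_images(question: str, image_cache: dict) -> list:
    """Return cached image paths whose topic appears in the student's question."""
    q = question.lower()
    return [
        path
        for topic, paths in image_cache.items()
        if topic in q
        for path in paths
    ]

# Hypothetical case cache for a dermatologic issue.
images = {"rash": ["rash_closeup.jpg", "rash_arm.jpg"]}

print(requested_images("Can I see the rash on the arm?", images))
# → ['rash_closeup.jpg', 'rash_arm.jpg']
print(requested_images("Any fever?", images))
# → []
```

Gating images behind an explicit request mirrors real practice: the student has to think to ask for the examination finding, rather than having it handed to them.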

While scaling the app is one of their goals, first and foremost, this tool is for the Dartmouth community and the student and faculty collaborators in Kenya.

If you have your own API key or a Dartmouth login, "MedSimAI" is ready to use. 

It’s been an amazing experience working on this project. As programmers, we need to grapple with the fact that large language models can completely change how we solve problems and truly push the boundaries of what is possible.

Simon Stone