This information is current as of the date of original publication or update. It may have changed by the time you read this. I invite you to fact-check what you read here.
This information is not intended for diagnosis or treatment. Before making health decisions, discuss with your physician or other qualified healthcare provider to decide what is right for you.
I recently had a medical procedure, and when the results came into my patient portal, I logged in to read them. The diagnosis was something I don’t have much professional experience with, so I did what many patients do with their medical information: I went online and searched for more information.
But this time I didn’t stop with a search engine. I used an AI assistant and was amazed at how easy it was not only to read information but also to ask questions and receive answers. It even suggested additional resources and next steps.
Then I found this article on KFF indicating that I am not alone, and that many of you have probably already used AI for this and other purposes. So I am sharing it here.
An AI Assistant Can Interpret Those Lab Results for You
(Edited for readability and length.)
written by Kate Ruder, September 15, 2025
When Judith Miller had routine blood work done in July, she got a phone alert the same day that her lab results were posted online. So, when her doctor messaged her the next day that her overall tests were fine, Miller wrote back to ask about the elevated carbon dioxide and low anion gap listed in the report.
While the 76-year-old Milwaukee resident waited to hear back, Miller did something patients increasingly do when they can’t reach their health care team. She put her test results into Claude and asked the AI assistant to evaluate the data.
Medical Records Plus AI Equals Understanding
“Claude helped give me a clear understanding of the abnormalities,” Miller said. The generative AI model didn’t report anything alarming, so she wasn’t anxious while waiting to hear back from her doctor, she said.
Patients have unprecedented access to their medical records, often through online patient portals such as MyChart. Federal law requires health organizations to immediately release electronic health information, such as notes on doctor visits and test results.

A study published in 2023 found that 96% of patients surveyed want immediate access to their records, even if their provider hasn’t reviewed them.
And many patients are using large language models, or LLMs, like OpenAI’s ChatGPT, Anthropic’s Claude, and Google’s Gemini, to interpret their records.
Use AI for Health Cautiously
That help comes with some risk, though. Physicians and patient advocates warn that AI chatbots can produce wrong answers and that sensitive medical information might not remain private.
Yet, most adults are cautious about AI and health. Fifty-six percent of those who use or interact with AI are not confident that information provided by AI chatbots is accurate, according to a 2024 KFF poll. (KFF is a health information nonprofit that includes KFF Health News.)
“LLMs are theoretically very powerful and they can give great advice, but they can also give truly terrible advice depending on how they’re prompted,” said Adam Rodman, an internist at Beth Israel Deaconess Medical Center in Massachusetts and the chair of a steering group on generative AI at Harvard Medical School.
Justin Honce, a neuroradiologist at UCHealth in Colorado, said it can be very difficult for patients who are not medically trained to know whether AI chatbots make mistakes.
“Ultimately, it’s just the need for caution overall with LLMs. With the latest models, these concerns are continuing to get less and less of an issue but have not been entirely resolved,” Honce said.
Rodman has seen a surge in AI use among his patients in the past six months. In one case, a patient took a screenshot of his hospital lab results on MyChart, then uploaded them to ChatGPT to prepare questions ahead of his appointment.
Rodman said he welcomes patients’ showing him how they use AI, and that their research creates an opportunity for discussion.
Roughly 1 in 7 adults over 50 use AI to receive health information, according to a recent poll from the University of Michigan, while 1 in 4 adults under age 30 do so, according to the KFF poll.

Should AI Give Medical Advice?
Using the internet to advocate for better care for oneself isn’t new. Patients have traditionally used websites such as WebMD, PubMed, or Google to search for the latest research and have sought advice from other patients on social media platforms like Facebook or Reddit.
But AI chatbots’ ability to generate personalized recommendations or second opinions in seconds is novel.
Liz Salmi, communications and patient initiatives director at OpenNotes, an academic lab at Beth Israel Deaconess that advocates for transparency in health care, had wondered how good AI is at interpretation, specifically for patients.
In a proof-of-concept study published this year, Salmi and colleagues analyzed the accuracy of ChatGPT, Claude, and Gemini responses to patients’ questions about a clinical note. All three AI models performed well, but how patients framed their questions mattered, Salmi said.
For example, telling the AI chatbot to take on the persona of a clinician and asking it one question at a time improved the accuracy of its responses.
Are Medical Records Private with AI?
Privacy is a concern, Salmi said, so it’s critical to remove personal information like your name or Social Security number from prompts. Data goes directly to the tech companies that have developed the AI models, Rodman said, adding that he is not aware of any that comply with federal privacy law or consider patient safety.
Sam Altman, CEO of OpenAI, warned on a podcast last month about putting personal information into ChatGPT.
“Many people who are new to using large language models might not know about hallucinations,” Salmi said, referring to a response that may appear sensible but is inaccurate.
For example, OpenAI’s Whisper, an AI-assisted transcription tool used in hospitals, introduced an imaginary medical treatment into a transcript, according to a report by The Associated Press.
Using generative AI demands a new type of digital health literacy that includes asking questions in a particular way, verifying responses with other AI models, talking to your health care team, and protecting your privacy online, said Salmi and Dave deBronkart, a cancer survivor and patient advocate who writes a blog devoted to patients’ use of AI.

Can AI Help Physicians Communicate Results?
Patients aren’t the only ones using AI to explain test results. Stanford Health Care has launched an AI assistant that helps its physicians draft interpretations of clinical tests and lab results to send to patients.
Colorado researchers studied the accuracy of ChatGPT-generated summaries of 30 radiology reports, along with patients’ satisfaction with them. Of the 118 valid responses from patients, 108 indicated the ChatGPT summaries clarified details about the original report.
But ChatGPT sometimes overemphasized or underemphasized findings, and a small but significant number of responses indicated patients were more confused after reading the summaries, said Honce, who participated in the preprint study.
Meanwhile, after four weeks and a couple of follow-up messages in MyChart, Miller’s doctor ordered a repeat of her blood work and an additional test that Miller suggested. The results came back normal.
Miller was relieved and said she was better informed because of her AI inquiries.
“It’s a very important tool in that regard,” Miller said. “It helps me organize my questions and do my research and level the playing field.”
KFF Health News is a national newsroom that produces in-depth journalism about health issues and is one of the core operating programs at KFF—an independent source of health policy research, polling, and journalism. Learn more about KFF.
This article first appeared on KFF Health News and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.
This article also appeared on NPR.ORG.
AI in Medical Education
I graduated from college, medical school, and residency without using a computer for training or patient care. When computers and the internet came into widespread use, medical education and medical practice changed to embrace that new technology.
Now AI is doing the same thing. Here you can learn how Harvard Medical School is building artificial intelligence into the curriculum to train the next generation of doctors.
Reader Reflection and Response
I hope you have learned something new in this post, or that it confirmed something you already knew.
How are you using technology to manage your healthcare? Do you use options such as
- health record portals
- online scheduling
- video visits
- internet search for medical information, with or without AI
Were you aware that doctors are using AI to create office notes and result summaries?
What concerns do you have about the value and safety of using AI in healthcare? What else do you need to know or want to learn about AI use?
Cover Image
The cover image was created by the Jetpack AI Assistant from Automattic, Inc.
Exploring the HEART of Health
I hope the information in this post gives you inspiration.
I’d love for you to follow this blog. I share information and inspiration to help you turn health challenges into health opportunities.
Add your name to the subscribe box to be notified of new posts by email. Click the link to read the post and browse other content. It’s that simple. No spam.
I enjoy seeing who is new to Watercress Words. When you subscribe, I will visit your blog or website. Thanks and see you next time.

Dr Aletha

I appreciate this informative article Aletha. I use MyChart too. I’ve never used AI to look up medical information but I’ve used it to look up various other things. Usually if I type a question into Google the first thing on the list is AI generated then it goes to all the other articles on said topic or question. So I can relate to what you’re saying since I can reference the AI response then go on to the other articles listed.
Thanks so much for sharing with Sweet Tea & Friends this month dear friend. I’m so happy you’re here.
Thanks, Paula. I’m glad to know you find it helpful. I think about what a helpful tool AI would have been when I was still in practice. My doctor told me it saves him time; he gets home to his family sooner now.
I am relatively new to using AI. I had no idea you could use it with medical test results. I do use and utilize MyChart on a regular basis. I can get a quicker answer that way rather than calling and leaving a message with my doctors. A dentist visit a few months ago left me a little wary, though. My dentist came in and said they’d had AI read my chart. It determined everything was ok. The thing is there had been no x-rays or any other tests done that could have been read other than what was already in my chart. I’m actually thinking of leaving this dental practice as it’s now corporate owned, and the staff is constantly changing. I think there is a place for AI in our lives, but I’m still a little wary of it.
https://marshainthemiddle.com/
My doctors also use MyChart. I am amazed at how it has changed doctor-patient communication, in some ways better, but not all. I am appalled a dentist would just accept an AI summary as true without reviewing it personally. I hope it’s because they think AI is 100% accurate and not because they are neglecting their responsibility to patients. We all have a lot to learn about AI and I hope we do before someone gets hurt.
Great post! I would say that this post confirmed what I already thought. AI can give you the information you are searching for, but keep in mind that the info might not be 100% accurate.
Yes, Catherine, that’s true. It’s like any other source of information, whether virtual or live. You have to trust the source and the information. Thanks for your input.