Crafting the ideal patient progress note, at least judging from the literature, is more easily described than achieved. Since the late 2000s, when the electronic health record replaced pen and paper in chronicling a hospital patient's story from admission to discharge, senior physicians have struggled, along with house staff, to make entries into the patient's record that actually mirror the patient's progress.
Articles over the past 15 years have decried the condition of these notes, finding that they contain unnecessary information and many inaccuracies, largely because their authors copied and pasted prior entries, a keyboard capability that has been dubbed an "egregious danger."
“These notes not only fail to reflect the orderly progression of thought and action…but also rarely provide clear documentation of the day’s events,” authors of one study in The American Journal of Medicine wrote in 2009. About 5 years ago, the American College of Physicians chimed in: “At present, it is far too easy to open a patient chart, read volumes of data, and find that no single person has stated what they believe is happening.”
Other articles have attempted to create audit tools or best-practice recommendations. ("Pulling in a full list of current medications is unlikely to improve the value of the note and may become outdated quickly depending on the timing of when a note is written.") In 2019, JAMA Network Open published a study, set in California hospitals, that documented a significant disconnect between the exams a group of nine emergency department trainees recorded as having completed and the exams 12 observers witnessed, either directly via audiovisual monitoring or by reviewing the recordings. Agreement between what was documented and what was observed was barely 50%; the observers, by contrast, agreed with one another 90% of the time.
Contributing to the notes' bloat is how hospitals use them, namely for coding, billing, and quality reporting. Those functions make the notes legal documents and, consequently, the repository for all manner of information unrelated to patient care, extra material that often obscures the note's reason for existing.
And it most likely negatively affects the person in the bed. “Poor notes negatively impact patient care,” wrote the authors of a new study conducted at Johns Hopkins, Baltimore, and published in the Journal of General Internal Medicine.
Bottom line: The right balance, one that combines seasoned staff's clinical experience with the near-innate tech know-how of incoming juniors, has continued to elude everyone.
Creating Readable, Usable Progress Notes
Refusing to accept that, Johns Hopkins’ Samuel Durso, MD, MBA, executive vice chair, Department of Medicine, Johns Hopkins University, and director, Department of Medicine, Johns Hopkins Bayview Medical Center, decided on a new tack for his study. He said in a recent interview that he had had enough of seeing poorly devised notes.

“Most of the time, you can’t tell what is going on,” said Durso.
The Durso team took a different path toward getting seasoned hospitalists to create readable, usable patient progress notes. They built a rubric with 15 gradable items, devised an educational intervention that was applied when needed, and included a control group. In all, 26 hospitalists participated; they were, on average, about 40 years old and had worked in hospital medicine for roughly 9 years.
The study team rated 156 of the participants' notes, sampled 6 months before the educational instruction and again afterward. The rubric was built from a literature review, 100 progress notes, the team's assessment of existing admission notes, and peer feedback. The results showed that progress-note quality improved when hospital physicians were guided by the rubric, with statistically significant improvement on 8 of the 15 graded items.
Hospital physicians do not deliberately create poor progress notes, Durso said. “It’s not because people are stupid or lazy; it’s because hospital systems are set up a certain way,” he said.
As for what happens next, Durso said he is thinking about ways to maintain gains from the study results by “baking” them into the actual workday.
"We don't have that answer yet," he said. "It takes people to do this and train them." One option could be to use Johns Hopkins' existing artificial intelligence software to help incorporate note quality into that workflow.
He addressed the size of his study. “We didn’t need this study to tell us we have a problem,” he said. His team wanted to see if it could design a plan that was workable in the Johns Hopkins environment “to give us an indication of whether the teaching method is effective.”
The study also lent the methodology some credibility. "At least what we have shown is we can train people to improve their notes, with adjudication between observers."
Source link : https://www.medscape.com/viewarticle/patient-progress-notes-often-fail-they-can-be-fixed-2025a1000hdv?src=rss
Publish date : 2025-06-30 08:45:00