Structured Reporting Software Creates Less Complete and Accurate Radiology Reports Than Free Text

November 16, 2009

WINSTON-SALEM, N.C. – As many software companies work to create programs that will give uniform structure to the way radiological test results are reported, a new study by researchers at Wake Forest University School of Medicine shows that such a system does not improve, but rather decreases the completeness and accuracy of the reports.

The study, published recently in Radiology, compared the accuracy and completeness of reporting test results in a free text, narrative format versus using standardized words and phrases from a pull-down menu (structured reporting).

“This research is our attempt to evaluate a new technology that is a pretty hot topic in medicine right now and has been for a few years,” said Annette J. Johnson, M.D., M.S., an associate professor of radiology and lead investigator on the study. “Since radiology began, we have been creating our reports in a free text, narrative format. The rationale behind efforts to change this format is that all of the reports that we create could potentially be a very useful database for clinical care and research if they were standardized.”

Standardization would mean that key content could be accessed through automated means, by computer systems, rather than requiring a human being to read the report and manually sift through narrative comments to try to find and categorize key content, Johnson said.

“This type of standardization is a very appealing idea, but we did not have data regarding what effect structured systems like this might have on individual report quality until now,” she added. “It turns out that a structured reporting system actually decreases the accuracy and especially the completeness of reports, which is the opposite of what we expected.”

This study provides the only known data on the effect of such structured reporting systems on the quality of radiology reports, Johnson said.

Currently, she explained, a physician might send a patient who is experiencing weakness in his arm, for example, for a head computed tomography (CT) scan to rule out concerns about a stroke. A radiologist then reads the CT scan and reports what she sees.

“I might say, ‘There’s no evidence of hemorrhage,’ or any other variety of wording to convey that idea. I could say, ‘no bleeding’ or ‘no hematoma’ or ‘no hyperdensity,’ all of which mean the same thing,” Johnson said. “In a structured system, I would choose from a list of standardized phrases with certain specific terms available in a dropdown menu, such as ‘No presence of acute stroke.’ Standardization seems simple, but it’s not always easy or what we commonly do in medicine. The theory is that structured reporting would make the reports intrinsically better because we’d all be using the same ideas recorded in the same verbiage instead of using numerous different ways to say ‘blood.’ Right now, several people reading a scan may all agree that they see the same thing, but each individual will say it in a different way.”

For the study, the researchers tested such a structured reporting system on two groups of residents. Each individual in both groups was given the same set of 25 brain magnetic resonance imaging (MRI) scans along with a video of a staff physician’s interpretation of the scans, and was asked to report the interpretation in the free text narrative format they were familiar with. Four months later, the same set of MRIs was given separately to each individual again. Half of the residents were asked to report their observations in the free text narrative form as they had the first time. The other half were asked to create their reports using the structured reporting software, which listed standardized sentences and phrases describing different findings to choose from.

“We thought that the structured reporting group would make better reports,” Johnson said. “However, the reports created using structured reporting software were actually substantially less complete and a little less accurate compared to the reports made by the same residents in free text four months earlier and with the other group of residents who used free text both times.”

The company that made the specific structured reporting software used for the study is no longer in existence, Johnson said, but many other software companies remain focused on creating these programs and finding ways to structure reports. These companies, and the physicians who choose to use structured reporting systems, should carefully consider how the software will affect the quality of real patient records, and all such software should be specifically tested for its effects on report quality before being implemented, she said.

The study was funded in part by the General Electric-Association of University Radiologists Radiology Research Academic Fellowship (GERRAF).

 

Media Relations

Jessica Guenzel: news@wakehealth.edu, 336-713-4587

Bonnie Davis: bdavis@wakehealth.edu, 336-713-1597