Quality Control With A Custom Fit

Less than a year ago, the radiology department at the Fletcher Allen Medical Center (FAMC), Burlington, Vermont, the hospital affiliate of the University of Vermont College of Medicine, was struggling with antiquated peer-review and quality-control (QC) methods. For peer review, according to Steven P. Braff, MD, radiology department chair, radiologists were relying on paper cards that had to be filled out by hand. Busy physicians balked at the added work, even though peer review is mandated by the Joint Commission and called for by the ACR®.

Steven P. Braff, MD

Moreover, Braff says, the FAMC radiology department had no way to formalize quality reviews of radiologic technologists. If radiologists noticed mistakes in a technologist’s imaging, they had to find a supervisor and follow a cumbersome reporting procedure.

A similarly burdensome procedure governed mistakes made by radiologists themselves. In that case, Braff says, the radiologist who spotted a possible interpretation error had to take the awkward step of questioning the work of a fellow physician.

Now, any radiologist who spots what he or she thinks is a mistake can handle such missed-materials cases electronically. The suspected error is identified in the PACS, and the case is either handled at once or (if the mistake is noncritical or the case is inactive) sent for review at the missed-materials conference that the radiology department holds monthly, Braff says.

The same software application that allows missed materials to be handled electronically using FAMC’s PACS also generates electronic peer reviews and electronic reviews of technologists, all by using drop-down menus and simple click responses, with an option for written comments.
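The article does not describe peerVue’s internals, but the workflow it outlines, a single electronic form with a drop-down category, click responses, and an optional written comment, routed to one of three review streams, is easy to sketch in the abstract. The short Python sketch below is purely illustrative; every class, field, and function name is an assumption, not peerVue’s actual data model or API.

```python
# Hypothetical sketch of the single QC form described above: one
# record type covering peer reviews, technologist reviews, and
# missed-materials reports. All names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum
from typing import Optional


class ReviewType(Enum):
    PEER_REVIEW = "peer review"
    TECHNOLOGIST_REVIEW = "technologist review"
    MISSED_MATERIALS = "missed materials"


@dataclass
class QCRecord:
    review_type: ReviewType
    study_id: str                  # accession number of the PACS study
    reviewer: str                  # radiologist submitting the review
    selection: str                 # value chosen from the drop-down menu
    comment: Optional[str] = None  # optional written comment
    critical: bool = False         # critical mistakes are handled at once
    submitted: datetime = field(default_factory=datetime.now)


def route(record: QCRecord) -> str:
    """Route a record the way the article describes: a critical
    missed-materials case is handled immediately, a noncritical one
    waits for the monthly conference, and everything else feeds the
    department's QC reporting."""
    if record.review_type is ReviewType.MISSED_MATERIALS:
        return "handle at once" if record.critical else "monthly conference"
    return "log for QC reporting"
```

The point of the single-record design is the one the article makes: because peer reviews, technologist reviews, and error reports all share the same few-click form, none of them requires a separate paper workflow.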

This electronic quality-review capability has given the radiology department a much better handle on QC, Braff says. It’s fast and easy for radiologists to do peer reviews or respond to errors with a few mouse clicks. He estimates that since installing the QC software on the PACS, the department has quadrupled its volume of peer reviews, technologist reviews, and missed-materials reports.

“If you tell physicians reading cases to pick up a pen and fill out a card, they just are not going to be able to do it nowadays, as busy as physicians are,” Braff says, “but with this, there’s no excuse not to do it. It’s really easy.”

To meet its peer-review and other quality-control needs, FAMC turned to peerVue (Sarasota, Florida), a QC software vendor that also offers consultation and customization. The latter turned out to be handy because FAMC wanted a range of custom features in its electronic QC programs.

FAMC is licensed for 562 beds and has more than 500 faculty physicians, including 35 radiologists on staff and an additional 30 radiology residents and fellows. The radiology department completes about 300,000 imaging studies per year, Braff notes.

Like many other faculty physicians, Braff holds multiple appointments. In addition to being department chair, he is professor of radiology and professor of neurology and neurological surgery. Braff is a hospital employee, as are the other FAMC radiologists. In addition to reading directly for the hospital, they read for other facilities with which the hospital has contracted. These contracts can be for preliminary and final interpretations, as well as overreads (including night-coverage services). FAMC itself does not use outside night coverage, Braff says.

Peer Review

The peer-review application that FAMC uses is more or less what peerVue offers out of the box; peerVue charges an enterprise fee for the software. Braff describes the package’s peer-review categories as nearly identical to those used by the ACR. “I believe it’s exactly the same; it certainly passes muster,” he says.

Of course, the radiology department has to decide how to use the software. One of FAMC’s decisions, Braff says, has been to forgo protocols and, instead, to encourage radiologists to do peer review by using a first-case-of-the-day format.

“I like them to review the first case that they read,” Braff says. “Just click on the prior study and drop down the peer review. Look at the old report and see if it’s the same as your opinion. That’s the random piece off the worklist; they don’t know what they’re going to read.”
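As a rough illustration of that first-case-of-the-day convention, the logic might look like the sketch below. This is a hypothetical stand-in, not FAMC’s or peerVue’s actual code; the worklist and the prior-report lookup are simulated with plain Python containers.

```python
from typing import Optional


def first_case_peer_review(worklist: list[str],
                           prior_reports: dict[str, str]) -> Optional[dict]:
    """Peer-review the first case read each day: take the first case
    off the worklist (effectively random, since the reader cannot
    choose it), pull up the prior study's report, and let the reviewer
    compare it with his or her own opinion."""
    if not worklist:
        return None
    case_id = worklist[0]                      # the random piece off the worklist
    prior_report = prior_reports.get(case_id)  # stand-in for a PACS lookup
    if prior_report is None:
        return None                            # no prior study to compare against
    # The reviewer's verdict ("agree" or a discrepancy category) would
    # then be captured through the drop-down menu described below.
    return {"case": case_id, "prior_report": prior_report}
```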

In the peer-review window, Braff says, the radiologist can click on “agree,” or on alternatives such as “diagnosis not ordinarily expected” or