Taking QA to the Next Level
Medicine in general is evolving toward a patient-centered, outcomes-based model, and radiology should be no exception, according to Timothy Myers, MD, senior vice president and CMO of NightHawk Radiology Services, Scottsdale, Arizona. “We’re looking at things less from the standpoint of the physician and more from the standpoint of the patient, and medicolegally, that’s where it’s always been,” Myers observes. “Everyone’s moving toward an outcomes-based approach. If our patients don’t get better, or experience an issue with the care we give them, we need to reevaluate.”

Quality assurance (QA) in radiology has customarily focused on how difficult an abnormality is to detect, Myers says. For example, the ACR’s RADPEER™ program, founded in 2002, ranks discrepancies according to how often radiologists should make the diagnosis; RADPEER level 4 findings are those for which a given diagnosis should be made nearly every time it is encountered.

The flaw of this system is its subjectivity, according to Myers. “Every radiologist misses RADPEER 4s. It’s easy to sit back and say the diagnosis should be made almost every time, but if you’re in a busy practice, you’re not reading quietly in a dark room. Given real-world circumstances, all bets are off as to whether a finding ‘should’ be seen,” he says.

Traditional QA is also undermined by its structure, Myers notes. “Nobody really wants to deal with it,” he says. “Nobody wants to be in the position of telling colleagues they missed something or didn’t do the right thing, because everyone knows how thin the glass is on his or her own house. Historically, you had a process where you looked at cases here and there, but it was very anecdotal: nothing systematic.”

The advent of RADPEER improved the process by disseminating a universal form and structure. “RADPEER was a huge step in the right direction,” Myers says.
“It brought a systematic approach to QA, and it enabled radiologists to hear from someone at a higher level than that of a peer about what they need to change and how they can educate themselves.”

Now, Myers says, it’s time to take QA to the next level by repositioning its focus, making patient outcomes the ultimate determinant of quality. That’s the approach NightHawk has taken with its own QA program, in which discrepancies are rated according to their impact on patient care: a category 4 discrepancy will have an immediate impact on how the patient is treated, while a category 3 discrepancy will require follow-up action at some point. “Looking at it from this standpoint is much more cut and dried,” Myers says. “We try to make it as objective as possible, though of course there are always gray areas. We want to change QA from a subjective process to an objective one.”

Radiologists’ interpretations should also be evaluated according to their usefulness to the clinicians who order them, Myers adds. “Traditional QA only looks at the interpretation from the perspective of right or wrong,” he says. “The next step is evaluating not only report accuracy, but report quality. The information getting back to the clinician has to be both accurate and effective. We need to be evaluating more fully how we are interacting with our medical colleagues and the care they give.”

Aligning radiology QA with patient outcomes will yield benefits for the profession, Myers predicts, by promoting the value of the specialty. “Radiologists are always being criticized for not being involved enough in the care of patients,” he notes, “but we’re very involved: A hospital without a good radiology program is a hospital in very serious trouble. We just need to stop being so introspective. I believe that changing the focus of our QA will go a long way toward bringing us closer to the patient.”

Toward this end, NightHawk has developed its own QA and Client Peer Review programs, both designed to be outcomes-based.
"Many QA and peer review programs aren't as effective as they could be," Myers notes. "Focusing on the patient outcome as well as the interpretation improves care." In a health-care environment where hospitals increasingly seek to cut costs while maintaining quality, Myers believes this renewed focus on patient outcomes could also rescue radiology groups from perceived obsolescence. "If we don't get involved in medical care in a real way, we are replaceable," he says. "If you're not value added, relevant, and effective, your position with your hospital is in jeopardy. In a number of cases where radiology groups have lost their contracts, their clinical colleagues haven't come to their rescue, but if you're relevant-if you're participating in medicine and being effective in your communication and care-when the hospital has an issue, your colleagues will come to your defense."