The CMO–CIO Partnership: Improving clinical quality through operational efficiency, Part II

Central to vRad’s mission are two concepts currently driving healthcare reform: improving clinical quality and achieving operational efficiency. Through the exceptionally tight partnership of vRad Chief Medical Officer Ben Strong, MD, and Chief Information Officer Shannon Werb, explored in Part I of this interview, vRad has achieved a synergistic melding of those concepts that is truly driving clinical innovation, as the discussion with Strong and Werb, continued below, makes clear.

Q. Let’s talk about clinical quality.  Where are we in the continuum of leveraging IT, and where do we need to go?

Strong: In the radiologic literature, administrative entities often try, I think, to obfuscate the issue of quality.

Quality in radiology simply comes down to accurate interpretation. It doesn’t need to be pushed and shoved into a mathematical equation with multiple, complex variables. In fact, quality is just not as nuanced as some in our specialty would like us to believe. At vRad, quality is a critical yet simple issue: is there a significant abnormal finding on the study, and did the radiologist identify it and report it? That’s it.

We are in a particularly advantaged position with regard to quality at vRad. First, our practice grew from providing preliminary studies, all of which are over-read by a second radiologist the following day. That means a second set of eyes on all of our reports, and direct and immediate feedback on the quality of our interpretations. In addition, we have a very responsive quality assurance committee and an online portal through which our client sites can submit any perceived discrepancies in interpretation, for both preliminary and final reads. Our technology tools let us process those submissions efficiently and turn the information into data and insight that we use internally to improve our clinical performance, and that we share with clients to ensure transparency.

Q. Would you describe how that works?

Strong: Highly trained QA committee members work in vRad’s integrated QA module, which lets them review cases with discrepancies and, as a first step, make sure that the radiologists involved review those cases. Committee members compile all of the input from multiple sources and ultimately code the case according to a variety of parameters so that, at a glance, we can see exactly what the essence of the discrepancy was and develop a process to address it, both specifically for the affected client and in general for our overall practice management.
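As a concrete illustration of that coding step, here is a minimal sketch of what a coded QA case might look like as a structured record; every field name and example value below is a hypothetical stand-in, not vRad’s actual data model.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CodedQACase:
    """One discrepancy case as a committee member might code it (illustrative only)."""
    case_id: str
    client_site: str
    modality: str                 # e.g., "CT", "MR", "US"
    interpreted_at: datetime      # when the original read was signed
    discrepancy_type: str         # e.g., "missed finding", "mischaracterized finding"
    radiologist_response: str     # the interpreting radiologist's view of the conflict
    committee_notes: str = ""
```

Structuring each case this way is what makes it possible to see the essence of a discrepancy at a glance and to aggregate cases later.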

Based on this initial model, we have compiled that data and leveraged it for a decade. On the clinical side, we use it to direct educational efforts for radiologists, both individually and across the practice. We look at workflow elements and other factors that might be affecting quality: time of day or night, beginning or end of shift, specific modality types.

All of those factors are available to us and allow us to tailor our educational and efficiency efforts to address perceived gaps in quality.
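To sketch the kind of slicing Strong describes, suppose the coded cases were exported to a flat file; the file name and column names below are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical export of coded QA cases, one row per discrepancy.
qa = pd.read_csv("qa_cases.csv", parse_dates=["interpreted_at"])

# Workflow factors mentioned above: time of day, position in shift, modality.
qa["hour_of_day"] = qa["interpreted_at"].dt.hour
qa["shift_phase"] = pd.cut(
    qa["hours_into_shift"],       # assumed column: hours elapsed since shift start
    bins=[0, 2, 6, 10],
    labels=["start", "middle", "end"],
    include_lowest=True,
)

# Discrepancy counts sliced by each factor.
print(qa.groupby("hour_of_day").size())
print(qa.groupby("shift_phase", observed=True).size())
print(qa.groupby("modality").size())
```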

Q. Who provides the continuing education—who reaches out to the individual radiologists?  Is the process automated or manual?

Strong: It's highly individualized and human, but it is enabled by technology. When a client site submits a discrepancy, the case is placed in a digital QA queue and an alert is automatically sent to the radiologist who initially interpreted the study. The system requires that they review the images and submit their impression of the conflicting opinion, and that response is then attached to the case in the QA queue.
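In code terms, the intake loop Strong describes might look something like the sketch below, assuming a simple in-memory queue; the function names, the gating rule, and the notify stand-in are all illustrative assumptions, not vRad’s actual system.

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class QACase:
    case_id: str
    client_report: str                          # the conflicting opinion from the client site
    radiologist_id: str
    radiologist_response: Optional[str] = None  # must be filled in before committee review


qa_queue: deque = deque()


def notify(radiologist_id: str, message: str) -> None:
    print(f"[alert -> {radiologist_id}] {message}")  # stand-in for a real alerting channel


def submit_discrepancy(case: QACase) -> None:
    """A client site submits a perceived discrepancy: queue it and alert the radiologist."""
    qa_queue.append(case)
    notify(case.radiologist_id, f"QA review required for case {case.case_id}")


def record_response(case: QACase, impression: str) -> None:
    """The interpreting radiologist reviews the images and responds to the conflict."""
    case.radiologist_response = impression


def next_case_for_committee() -> Optional[QACase]:
    """Committee members only see cases the radiologist has already responded to."""
    for case in list(qa_queue):
        if case.radiologist_response is not None:
            qa_queue.remove(case)
            return case
    return None
```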

Next, a member of the QA committee opens the case. We have QA committee members who have been serving in that role for years; they are vigilantly overseen by our director of QA and have been trained in the coding and analysis of QA data. That rigorous training gives us an unusual degree of internal validity: our reviewers are very experienced and very consistent.

The reviewing physician opens the case, reviews the images, the report submitted by the client site, and the input from the radiologist who initially interpreted the study, and ultimately comes to their own conclusion, informed by the facts and by longstanding experience reviewing QA cases. The final QA disposition is coded using a variety of parameters in our database.

If a discrepancy is found, the case gets both a severity grade reflecting the potential effect on patient care and a code categorizing the nature of the discrepancy.
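To round out the picture, the final disposition might be recorded along these lines; the grade scale and category names are invented for illustration and are not vRad’s actual rubric.

```python
from dataclasses import dataclass
from enum import Enum


class Severity(Enum):
    """Illustrative severity scale reflecting potential effect on patient care."""
    MINIMAL = 1       # unlikely to affect care
    MODERATE = 2      # could alter management
    SIGNIFICANT = 3   # likely to alter management
    CRITICAL = 4      # immediate risk to the patient


@dataclass
class Disposition:
    case_id: str
    severity: Severity
    discrepancy_code: str   # e.g., "missed finding" (hypothetical category)
    reviewer_id: str
```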