Scientific Methodology for Performance Assessment

Quality improvement. It's the phrase on everyone's lips at this year's RSNA, and several sessions this morning will focus on the nuts and bolts of the process, providing a roadmap for taking concrete steps toward practice improvement. Dr. James Duncan spoke on using the scientific method to assess physician performance. "The public is spending an incredible amount of money on health care," he said, "and their impression is that they're not getting their money's worth."

Duncan touched on the Shewhart/Deming cycle, Six Sigma, Lean, Lean Six Sigma and other methods of performance measurement and improvement. "But it all boils down to the scientific method," he said. "Study the past to gain knowledge, and use that knowledge to influence the future." When assessing physician performance, he said, there are three steps: collect data, look for trends, and predict the future. "Inferences are fragile," he warned. That's why having hard data is so crucial -- as they say, you can't improve what you can't measure.
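The collect-data, look-for-trends loop Duncan describes is essentially Shewhart's control-chart idea. A minimal sketch of that idea in Python (the numbers and function names are mine, purely illustrative):

```python
from statistics import mean, stdev

def control_limits(samples):
    """Shewhart-style limits: the historical mean +/- 3 standard deviations."""
    center = mean(samples)
    spread = stdev(samples)
    return center - 3 * spread, center + 3 * spread

def out_of_control(samples, new_value):
    """Flag a new observation that falls outside the historical limits."""
    low, high = control_limits(samples)
    return not (low <= new_value <= high)

# Hypothetical monthly fluoro times in minutes (illustrative data)
history = [22, 25, 21, 24, 23, 26, 22, 24]
print(out_of_control(history, 48))  # a large jump trips the limit: True
print(out_of_control(history, 24))  # an ordinary value does not: False
```

Points inside the limits are ordinary variation; a point outside them is the "trend" worth investigating before predicting the future from it.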

In designing a physician assessment, Duncan recommends four major areas of examination: the domain model, or what skills are desired; the task model, or what will evoke evidence of skill; the evidence model, or how to score responses; and the decision model, or how to aggregate scores. To break each down:
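The four areas fit naturally into a single record. As a sketch only (the class and field names are mine, not Duncan's terminology beyond the four model names):

```python
from dataclasses import dataclass

@dataclass
class AssessmentDesign:
    """The four areas of examination Duncan recommends, as one record."""
    domain_model: str    # what skills are desired
    task_model: str      # what will evoke evidence of skill
    evidence_model: str  # how to score responses
    decision_model: str  # how to aggregate scores

design = AssessmentDesign(
    domain_model="IR procedural skill",
    task_model="supervised catheter placement",
    evidence_model="efficiency + quality subscores",
    decision_model="weighted aggregate across cases",
)
```

Writing the design down this explicitly makes it harder to skip a model -- an assessment with no decision model, for example, produces scores no one knows how to combine.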

The Domain Model

This could include direct observation and surveys, or data like billing records. "That is very granular information based on what the person is actually doing in the practice," Duncan said.

The Task Model

"All voluntary actions have two phases," Duncan said. "Planning and execution. The planning phase is hard to discern, but it's important to assess planning skills. Highly skilled plans, even when coupled with low skill in execution, usually turn out well."

The Evidence Model

The evidence model requires a methodology for scoring performance. Duncan demonstrated how the decision of where to place a catheter could be scored based on the likelihood of causing an adverse event. The evidence model, he said, should incorporate both an efficiency subscore (based on resources spent in planning and execution) and a quality subscore (based on how well the person has been able to achieve what they planned to do).
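One simple way to combine the two subscores is a weighted sum. This is an assumption on my part -- Duncan did not give a formula -- so the weights and function below are illustrative only:

```python
def composite_score(efficiency, quality, quality_weight=0.6):
    """Combine the two subscores on a 0-1 scale.

    efficiency: 0-1, higher means fewer resources spent in planning/execution.
    quality: 0-1, higher means the plan was achieved more completely.
    The 0.6/0.4 weighting is an illustrative assumption, not Duncan's formula.
    """
    if not (0 <= efficiency <= 1 and 0 <= quality <= 1):
        raise ValueError("subscores must lie in [0, 1]")
    return quality_weight * quality + (1 - quality_weight) * efficiency

print(round(composite_score(efficiency=0.8, quality=0.9), 2))  # 0.86
```

Weighting quality above efficiency reflects the point about planning: a well-planned procedure executed slowly usually still turns out well.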

Duncan began measuring the fluoroscopy time for every interventional radiology procedure in his facility's RIS back in 2008. Using this as a representation of radiologist performance, he was able to observe certain patterns, such as fluoro times shooting up in the month of July. Unsurprisingly, fellows have higher-than-average fluoro times, which Duncan chalks up to uncertainty. He recommends that attendings enforce a "speed limit" of 60 minutes on fluoro, issuing "tickets" that can be discussed during evaluation. I wish I could show you some of his illustrations, including one that visualizes the factors that could influence fluoro time -- they're fascinating graphic representations of what you intuitively know but may not be able to verbally quantify.
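The "speed limit" idea amounts to a simple threshold query over the recorded fluoro times. A sketch, assuming nothing about any real RIS schema (the identifiers and data are made up):

```python
FLUORO_SPEED_LIMIT_MIN = 60  # Duncan's suggested ceiling, in minutes

def issue_tickets(procedures):
    """Return (procedure_id, fluoro_minutes) pairs exceeding the limit.

    `procedures` maps a procedure identifier to its recorded fluoro time
    in minutes; the field names are illustrative, not a real RIS export.
    """
    return [(pid, t) for pid, t in procedures.items()
            if t > FLUORO_SPEED_LIMIT_MIN]

cases = {"IR-1001": 35, "IR-1002": 72, "IR-1003": 58, "IR-1004": 66}
print(issue_tickets(cases))  # [('IR-1002', 72), ('IR-1004', 66)]
```

The point of a ticket is not punishment but a concrete, dated data point to discuss at evaluation -- exactly the "hard data" the scientific method needs.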

"It's important to design good assessments," Duncan concluded. "Performance improvement strategies have many different names, but just remember -- they're all the scientific method."