Performance management: the other stimulus plan


There's been a lot of talk here at RBMA about "performance improvement," those two magical words that, according to many, are the key to unlocking your business' full potential. But how exactly is performance measured, benchmarked and managed? On hand to answer that question this morning was Fred Downs, practice administrator at Diagnostic Imaging Specialists in Atlanta, Georgia.

Over the years, DIS’ performance plan has evolved from a “capital punishment” model to a “performance improvement” model. In the capital punishment model, qualifying events and behaviors for termination were enumerated, but feedback on performance was absent, and there was no organized mechanism for dealing with issues.

Wanting to address these problems, DIS moved to a production model of performance, with poor results. “My personal favorite was, ‘I don’t believe in the RVU system,’” Downs recalled. People argued that the problems weren’t on their end, or that they already worked hard enough. Over time, there was little change in staff behavior.

The goal evolved. “What if we redefined what performance was?” Downs asked.

DIS set a goal: to have a credible, multi-source evaluation methodology for monitoring and improving physician performance and contribution. “When we began talking about performance versus production, it changed the landscape of how we would evaluate,” Downs said. New goals included:

* Performance had to be defined comprehensively
* The system had to be transparent, and people needed the opportunity to help develop the model
* The system had to offer multiple ways to evaluate performance that were data-driven and specific to the practice
* People had to have the opportunity to self-correct
* If self-correction didn’t occur, then there had to be consequences

The new model of performance included factors like practice building, corporate citizenship, production, and clinical quality. Next, DIS developed a code of conduct “to enumerate those things that physicians value within their practice: anything from sexual harassment philosophy to customer service and quality,” said Downs. The next step was establishing policies such as progressive discipline, which included penalties short of termination, like restricting vacation time. DIS adopted RADPEER as a clinical measure and created its own peer evaluation tool.

An annual peer review evaluation was adopted, and all findings were correlated. Performance was now continually monitored, and feedback was continually given. The board received all the results in aggregate.

“When physicians receive the peer evaluation, they see their results benchmarked against the results of the entire practice,” Downs said. “RVU information is benchmarked to their subset within the organization – general radiology and so on.” The physicians spent a long time developing the peer evaluation document, which became a 16-point survey ranking physicians on issues like:

* Follows established protocols and standards
* Accurately and thoroughly interprets studies
* Efficiently uses time
* Answers pages/phones and returns calls promptly
* Performs his/her share of the work

In a table of results, two radiologists stood out: one was six RVUs below the mean, the other five, and both were one standard deviation below the mean in terms of production. For both radiologists, these results were confirmed by the peer survey process, which also illuminated additional issues.
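The screen Downs described — flagging physicians whose production falls more than one standard deviation below the group mean — can be sketched in a few lines. The RVU figures below are hypothetical, invented purely for illustration; the article does not publish DIS's actual data.

```python
from statistics import mean, pstdev

# Hypothetical monthly RVU totals per radiologist (illustrative only,
# not DIS's real figures).
rvus = {
    "rad_a": 540, "rad_b": 555, "rad_c": 560,
    "rad_d": 548, "rad_e": 505, "rad_f": 500,
}

avg = mean(rvus.values())   # group mean production
sd = pstdev(rvus.values())  # population standard deviation

# Flag anyone more than one standard deviation below the group mean,
# the kind of cutoff DIS applied to its production data.
flagged = [name for name, value in rvus.items() if value < avg - sd]
print(flagged)
```

With these sample numbers, two of the six radiologists fall below the one-standard-deviation line, mirroring the pattern Downs described; in a real practice the flag would be one input correlated with the peer survey, not a verdict on its own.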

These staff members were placed on a performance improvement plan that included monthly RVU reports and weekly feedback, after which both radiologists began to improve gradually.

DIS learned a lot from the process. The practice knew who the troublesome radiologists were before it began, but by measuring and monitoring all individuals equally, it created an atmosphere in which everyone received fair treatment throughout. Downs stressed that good, reliable, accessible data, used judiciously, is key.

But what he emphasized most was the “accordion planning” piece of the puzzle. “We ended up spending a lot of time defining the policies, discussing them, voting on them,” Downs said. “Everybody had some ownership. Not everyone thought it was wonderful. But we got all of them involved, and we have begun to correct performance in a constructive way where people have not been terminated.”