Architecture and Data Integrity Are Critical to Analytics Success

As radiology practices around the country become increasingly reliant on business analytics and intelligence to support decision-making, the time is ripe to devote more attention to refining the processes by which their measurements are generated, according to Bill Pickart, CEO of Integrated Medical Partners (IMP). “It is imperative that you understand the quality of the underlying data you are getting, and there are degrees,” he says. “If a database is constructed properly, there should be very little additional effort or cost associated with quantifying and qualifying the data for use by practice decision makers.”

Pickart outlines three key considerations for improved database architecture: data sourcing and origin, data integrity in acceptance and handling, and presentation of analytics.

“Many groups focus on data presentation or dashboarding, but if thought isn’t given to the structure and logic of your overall analytics strategy, the dashboards become less relevant and useful over time,” he says. “When this is handled properly, however, you can have confidence that you’re taking the right course of action.”

Data Sourcing

For comprehensive decision support, today’s radiology practices must draw on data from a variety of sources, both internal and external, Pickart says. Internal sources include RIS, PACS, revenue cycle management (RCM) and practice-management systems, utilization-management or appropriateness-criteria systems, precertification and preauthorization programs, critical-results alert programs, and financial costing systems, to name a few. “The challenge is that these sources tend to be highly vertical and oftentimes closed,” Pickart notes. “For instance, you might get some dynamite reports from the PACS, but they won’t be correlated with information from the RCM or financial system.”
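The correlation Pickart describes is, at bottom, a join on a shared key. As a minimal sketch, assuming each system can produce a flat-file export keyed on accession number (the file and field names below are hypothetical, not the schema of any particular RIS, PACS, or RCM product), the two verticals can be paired in a few lines of Python:

```python
import csv

def load_keyed(path, key_field):
    """Index a flat-file export by a shared key such as accession number."""
    with open(path, newline="") as f:
        return {row[key_field]: row for row in csv.DictReader(f)}

# Hypothetical exports from two closed, vertical systems.
pacs = load_keyed("pacs_studies.csv", "accession_number")
rcm = load_keyed("rcm_charges.csv", "accession_number")

# Pair each interpreted study with its billing record...
correlated = {
    acc: {**study, **rcm[acc]} for acc, study in pacs.items() if acc in rcm
}
# ...and surface studies that never produced a charge.
unbilled = sorted(acc for acc in pacs if acc not in rcm)
```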

External data sources might include hospital or departmental information systems, Pickart says. “Typically, a hospital-based practice will want to draw from the information system where the patient demographics were originally captured,” he explains. “You might also want to include studies or benchmarks from third-party research houses or data from your community’s health information exchange.”

Data Acceptance

The next step is for a practice’s database architecture to facilitate acceptance of these data. Having such a wide array of sources makes reconciliation and cross-referencing particularly important. “You need matrix capabilities—the opportunity to create internal data points relevant to any of the information you are bringing in, so you can design valid and relevant practice benchmarks,” Pickart says. “Internal benchmarking and cross-field referencing are major components in elevating the quality of your analytics and decision support.”
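One reading of these “matrix capabilities” is the ability to derive new internal data points from any combination of incoming fields. The sketch below, with assumed field names, cross-references timestamps that originate in two different source systems to create a turnaround-time benchmark, and uses the same cross-reference as a validity check:

```python
from datetime import datetime

def derive_turnaround(record):
    """Create an internal data point by cross-referencing fields from
    different sources: study completion time (from the RIS/PACS feed)
    and report finalization time (from the reporting system)."""
    completed = datetime.fromisoformat(record["study_completed_at"])
    finalized = datetime.fromisoformat(record["report_finalized_at"])
    turnaround_hours = (finalized - completed).total_seconds() / 3600

    # Cross-field referencing doubles as a validity check: a negative
    # turnaround means the two sources disagree and need reconciliation.
    if turnaround_hours < 0:
        raise ValueError(f"source clocks disagree for {record['accession_number']}")
    return {**record, "turnaround_hours": round(turnaround_hours, 2)}
```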

For instance, he says, when comparing radiologist utilization against external benchmarks, a practice could decide that to qualify as a full-time equivalent (FTE), each of its radiologists needs to log 45 hours of work time a week. To ensure the quality of the measure, “You send that data point through a series of filters such that, by the time it reaches the data warehouse from which the analytics draw, it has been broken down to its most basic element,” Pickart says.
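A minimal sketch of such a filter chain, using the 45-hour FTE definition from Pickart’s example (the time-entry format here is an assumption for illustration), might reduce each logged entry to its most basic element before rolling anything up:

```python
FTE_HOURS_PER_WEEK = 45  # the practice's own FTE definition from the example

def normalize_hours(entry):
    """First filter: reduce a logged time entry to its most basic
    element -- decimal hours -- however the source reported it."""
    if "minutes" in entry:
        return entry["minutes"] / 60
    return float(entry["hours"])

def weekly_fte(entries):
    """Second filter: roll the basic elements up into an FTE fraction
    measured against the practice-defined 45-hour week."""
    return sum(normalize_hours(e) for e in entries) / FTE_HOURS_PER_WEEK

# A radiologist logging 30 hours plus 900 minutes in one week:
# weekly_fte([{"hours": 30}, {"minutes": 900}]) -> 1.0
```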

Comparing that measure to an external benchmark has its own challenges, Pickart notes. “Typical external benchmarks rely on self-reported data, but that data is subject to inconsistencies because it comes from different practices, each with different definitions of an FTE,” he says. By establishing rules for how data enters the data warehouse or repository from which analytics are produced, practices can avoid the “garbage in, garbage out” trap that leaves them with unreliable information, Pickart says. “When you are relying solely on third-party, self-reported data, you’re introducing ambiguity into the data itself, and unless you account for that, you’re doing a disservice to the leaders of your practice when they make decisions on it,” he says. “When you understand and set rules for data acceptance, by the time the information reaches the data pool, it is very clean and crisp. There’s no variation to account for.”
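What such acceptance rules look like will vary by practice; the sketch below uses a few illustrative checks (the rule set and field names are assumptions) to show the pattern of validating a record, and recording why it failed, before it is allowed into the warehouse:

```python
ACCEPTANCE_RULES = [
    # (rule name, predicate) -- illustrative rules, not a standard set
    ("has accession number", lambda r: bool(r.get("accession_number"))),
    ("recognized modality", lambda r: r.get("modality") in {"CT", "MR", "US", "XR"}),
    ("non-negative charge", lambda r: float(r.get("charge", 0)) >= 0),
]

def accept(record):
    """Apply every acceptance rule before a record may enter the
    warehouse; return (accepted, failed rule names) so rejected
    records can be quarantined rather than silently loaded."""
    failures = [name for name, check in ACCEPTANCE_RULES if not check(record)]
    return (not failures, failures)
```

Returning the names of the failed rules, rather than simply dropping the record, keeps the “garbage” visible so the source feed can be corrected upstream instead of worked around.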

Reconciliation and cross-referencing, if properly managed, allow the practice to improve the quality of the data being used for comparison, Pickart explains. “You can then run all kinds of analytics on basic data points without being subjected to the interpretation