HAP Radiology Billing and Coding Blog

How Benchmarking Can Help Radiology Practices Evaluate Their Productivity

Posted by Sandy Coffta on May 20, 2019

It’s natural to want to compare one’s performance against others or against some standard.  Radiologists often chat among themselves about the number of exams they read per year, or perhaps the number of RVUs (Relative Value Units) they generate.  While there are inherent problems with some of these comparisons, as we outlined in our recent article Understanding the Value of RVUs in Radiology, measuring and monitoring productivity can benefit both a radiology practice and the individual radiologist.

 

The work RVU component of the Medicare Resource-Based Relative Value Scale (RBRVS) is the most commonly used metric for productivity measurement.  Work RVUs are readily obtainable from the practice’s billing system; if the system does not report them directly, they are easily calculated in a spreadsheet by multiplying procedure volume by the RVUs per procedure listed in a table on the Medicare website.  The work RVUs per procedure are the same whether the procedure is performed in a hospital or a freestanding imaging center.
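That spreadsheet arithmetic can be sketched in a few lines of code.  The CPT codes and work RVU values below are illustrative placeholders, not figures drawn from the current Medicare Physician Fee Schedule:

```python
# Illustrative sketch: totaling work RVUs from billing-system volumes.
# The codes and RVU values are placeholders for whatever appears in the
# practice's billing data and the Medicare fee schedule table.
work_rvu_table = {"71045": 0.18, "74177": 1.82, "70553": 2.29}
procedure_volume = {"71045": 1200, "74177": 450, "70553": 300}

# Total work RVUs = sum over procedures of (volume x work RVUs per procedure)
total_work_rvus = sum(
    volume * work_rvu_table[cpt]
    for cpt, volume in procedure_volume.items()
)
```

The same calculation extends to any number of procedure codes; the only inputs needed are the volume report and the fee schedule’s work RVU column.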

 

Work RVUs are a measure of clinical effort.  They do not take into account any non-clinical activities such as practice administration, teaching and training, research, or practice development.  Nor do they account for valid and important clinical activities such as time spent speaking with a patient or referring physician, or for differences in complexity between cases billed under the same procedure code.

 

Obtaining an independent benchmark value against which to compare your own performance is somewhat problematic.  Surveys conducted by organizations such as the American College of Radiology, the Radiology Business Management Association, and the Medical Group Management Association attempt to produce benchmarking statistics.  However, unless a survey rigorously gathers very specific data through questions posed in just the right way, these survey benchmarks are quite meaningless.  Practices across the country have very different work cultures that can affect their productivity, so any nationally aggregated data would have to be broken down very finely by practice setting, group size, modality mix, hours worked per day, days worked per year, and so on in order to be comparable to another practice.  Unfortunately, participation in these surveys is not as widespread as one would hope, so such finely tuned data is usually not available.

 

One practical approach to using productivity data measured in work RVUs is to compare performance within a single practice, or against oneself over time.  This removes the difficulty of using an independent value derived from unknown sets of practice patterns that do not necessarily reflect those in your own practice.  The practice can review its own data to determine its own set of benchmarks and goals for productivity.  Most importantly, the practice can define how to calculate the productivity measure in the way that is most relevant to its own circumstances.

 

Since work RVUs are a measure of clinical activity, it makes sense to compute RVU production only for the hours or days actually worked, deriving a per diem RVU rate.  This removes the variability introduced when one physician has more or less time off than others, or when a physician works less than full-time in the practice.  Similarly, non-clinical time should be removed from the calculation.  Here is an example of the calculation:

 

            Total RVUs   Days Worked   Admin Days   Adj. Days Worked   RVUs per Work Day
Doctor A      11,071.0         225.0       (27.0)              198.0               55.91
Doctor B      11,794.0         241.0                           241.0               48.94
Doctor C      10,795.0         225.0                           225.0               47.98
Doctor D       5,667.0         119.0                           119.0               47.62
Doctor E      11,039.0         232.0                           232.0               47.58
Doctor F       9,982.0         223.0                           223.0               44.76
Doctor G      10,018.0         228.0                           228.0               43.94
Doctor H      10,610.0         239.0                           239.0               44.39
Doctor I      10,232.0         240.0                           240.0               42.63
Total         91,208.0       1,972.0       (27.0)            1,945.0
                                                              Mean =               46.89
                                                            Median =               47.58

***The data presented here is for illustrative purposes only and does not represent a radiology group’s actual productivity.***

 

In our sample group, productivity ranges from 5,667 to 11,794 work RVUs per year, the doctors do not all work the same number of days, and one of the doctors spends time as the practice’s managing partner.  Looking only at the raw data (Total RVUs), it would seem that Doctor B is the most productive since she has the highest total RVUs.  However, when we consider the number of days she worked in the year, she drops to second in the ranking.  The managing partner (Doctor A) had fewer total RVUs than Doctor B but, after adjusting for his administrative days, he is the highest producer.  Doctor D generated the lowest number of RVUs, but he works only half-time and ranks 4th in the practice’s productivity.
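The per diem adjustment can be sketched in code using the illustrative figures from the table above (variable names are ours; only Doctor A has administrative days):

```python
# Per diem work RVU calculation for the illustrative sample data:
# (total RVUs, days worked, admin days) per doctor.
data = {
    "A": (11071.0, 225.0, 27.0),
    "B": (11794.0, 241.0, 0.0),
    "C": (10795.0, 225.0, 0.0),
    "D": (5667.0,  119.0, 0.0),
    "E": (11039.0, 232.0, 0.0),
    "F": (9982.0,  223.0, 0.0),
    "G": (10018.0, 228.0, 0.0),
    "H": (10610.0, 239.0, 0.0),
    "I": (10232.0, 240.0, 0.0),
}

# RVUs per work day = total RVUs / (days worked - admin days)
rates = {}
for doc, (rvus, days, admin) in data.items():
    adj_days = days - admin          # clinical days only
    rates[doc] = rvus / adj_days

# Group mean is computed on the aggregate, as in the table:
# total RVUs across the group divided by total adjusted days.
total_rvus = sum(r for r, _, _ in data.values())
total_adj_days = sum(d - a for _, d, a in data.values())
mean_rate = total_rvus / total_adj_days

# Median of the nine individual per diem rates (middle value).
sorted_rates = sorted(rates.values())
median_rate = sorted_rates[len(sorted_rates) // 2]
```

Note that the mean here is the aggregate rate (91,208.0 / 1,945.0), matching the table, rather than a simple average of the nine individual rates.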

 

An alternative approach would be to include all of the non-clinical aspects of the practice in the productivity measurement.  This would involve assigning RVU values to administration, teaching, practice development, etc. and then including those activities in the RVU totals.  A system for gathering the information for this reporting would have to be devised and implemented.  Duszak and Muroff[*] provide a good overview of these non-clinical considerations, as well as examples for reporting productivity metrics.
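One simple scheme such a system might use (our assumption for illustration; the article does not prescribe a specific method) is to credit non-clinical days at an RVU-equivalent rate, such as the group’s mean clinical rate:

```python
# Hypothetical scheme: credit non-clinical days (administration,
# teaching, practice development, etc.) at the group's mean clinical
# rate so they count toward an "all-activity" RVU total.
# The rate below is the mean from the sample table.
GROUP_MEAN_RVUS_PER_DAY = 46.89

def all_activity_rvus(clinical_rvus, non_clinical_days,
                      credit_rate=GROUP_MEAN_RVUS_PER_DAY):
    """Clinical work RVUs plus an RVU-equivalent credit for non-clinical days."""
    return clinical_rvus + non_clinical_days * credit_rate

# Doctor A from the table: 11,071.0 clinical RVUs plus 27 admin days.
doctor_a_total = all_activity_rvus(11071.0, 27.0)
```

A group could just as reasonably assign a different credit rate to each activity type; the point is that whatever weights are chosen must be agreed upon and recorded consistently.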

 

Presentation of the productivity data can be done in a table such as the one shown above, or more creatively using graphs or an individualized report card.  Some practices choose to present productivity data in a blinded fashion, while others identify each physician’s data by name.  In our example, each doctor could be told his or her own Doctor letter to keep the results anonymous, or the doctors’ names could be listed for full disclosure.  This is one of the decisions for the group to make when considering the collection and reporting of productivity data.

 

Within a practice, there might also be different benchmarks for various sub-specialties.  If the group is large enough, sub-specialists can be compared with each other rather than with a composite benchmark for the group as a whole.  Regardless of the reporting method, it is important for the group to reach a consensus on the goals of productivity measurement, the measurement methods to be used, the target benchmark to be attained, and the consequences (if any) of not attaining the target.  This last consideration should be very carefully thought out before embarking on the program, and perhaps not applied until some time after the group has become comfortable with the measurement program and has been able to adjust the measurement process to best meet the group’s needs.

 

The subject of measuring radiologist productivity has been covered in many articles over the years.  There is no right or wrong system for a particular group other than the one the group’s members feel is best for them.  The measurement system should be adjusted dynamically as the group’s needs and circumstances change.  Healthcare Administrative Partners routinely reports work RVUs and we can assist our clients in the analysis and understanding of their practice’s productivity results.

 

[*] For more in-depth information on this topic, the articles by Drs. Duszak and Muroff in the June 2010 Journal of the American College of Radiology (Part 1) and (Part 2) are well worth reading, and their lists of references point to additional material.

 

Sandy Coffta is the Vice President of Client Services at Healthcare Administrative Partners.

 
