Avoid Charting Performance Measures in Confusing or Misleading Ways
Last month, I wrote a post titled “Consulting Case Studies Need Statistical Validity,” where I discussed the need to show more than simple “before and after” numbers in case studies. I promised some follow-up posts on this topic, so today I'll talk about three ways that healthcare organizations present data that can be confusing or misleading:
- Using bar charts instead of line charts
- Using a misleading non-zero Y-axis
- Showing different years' data as different lines on the same chart
I'm a big fan of run charts to show sequential time series data instead of bar/column charts, as shown below:
Maybe it's my early training in Statistical Process Control, but I find the line chart (aka a run chart) on the right much easier and clearer to read. It's easier to detect trends and to see the continuity of the data. Maybe this one is just personal preference, but I've heard others say the same. What do you think?
To the second point, I've intentionally made a certain “error” in the construction of the charts above… I used a non-zero Y-axis. For most workplace metrics reporting, I think a non-zero axis tricks us into thinking changes are more significant than they are. Of course, a full-blown SPC chart is really the best (only?) way to detect significant process shifts, but if people are just eyeballing things, let's try to present an accurate picture to those eyes.
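As a rough illustration (with made-up numbers, not taken from any real chart), here's how the same small improvement can fill half of a truncated axis but barely register on a full 0 to 100% axis:

```python
# A minimal sketch of how a truncated Y-axis exaggerates a change.
# The numbers here are hypothetical, chosen only for illustration.

def visual_change_ratio(old, new, axis_min, axis_max):
    """Fraction of the plotted axis height that the change occupies."""
    return (new - old) / (axis_max - axis_min)

old, new = 0.90, 0.93  # patient satisfaction moves from 90% to 93%

truncated = visual_change_ratio(old, new, 0.88, 0.94)  # axis from 88% to 94%
full_axis = visual_change_ratio(old, new, 0.0, 1.0)    # axis from 0% to 100%

print(round(truncated, 2))  # 0.5  -> the jump spans half the chart height
print(round(full_axis, 2))  # 0.03 -> the same jump is barely visible
```

The underlying data is identical in both cases; only the axis choice changes how big the improvement looks to the eye.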
Compare the above charts to the versions below, with a Y-axis that runs from 0 to 100% for patient satisfaction:
I think this latest set of charts creates less risk of being misleading and, again, the line chart on the right is easier to read than the column chart on the left.
As a minor and possibly meaningless kaizen point, the line chart requires less ink to print. Please PayPal me 10% of your cost savings on ink cartridges :-)
Again, the line/run chart on the right would be better as a full-blown SPC chart (which I'll cover in the next post).
Finally, let's think about multiple years' worth of data. I've seen a number of organizations chart such data like this:
I don't like seeing multiple years of data stacked on top of each other like that. It's easy to glance at it and say, “well, generally each year is higher than the last, so we must be improving.”
Maybe, maybe not.
I'd rather see all 36 data points laid out in a single run/line chart, like this:
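As a simple sketch (with hypothetical numbers), getting from three overlaid yearly lines to one run chart is just a matter of concatenating the series in chronological order:

```python
# Three years of monthly data, as they're often stored: one series per year.
# The values are made up for illustration.
year_1 = [72, 74, 71, 73, 75, 74, 76, 75, 77, 76, 78, 77]
year_2 = [76, 78, 77, 79, 78, 80, 79, 81, 80, 82, 81, 83]
year_3 = [82, 81, 83, 82, 84, 83, 85, 84, 86, 85, 87, 86]

# One chronological sequence instead of three lines stacked on top of each other
sequence = year_1 + year_2 + year_3
print(len(sequence))  # 36 data points, plotted left to right in time order
```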
Hmmm, is there really that clear of a trend? I'll throw another chart at you, showing the mean-line (average) for an SPC-style chart where the average is based on the first 20 data points (and it has a non-zero Y-axis):
It doesn't look like there is a clear trend there. We don't have eight consecutive points above the mean. Is there really an upward trend? From this view, it appears not.
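The eight-points-above-the-mean test mentioned above is one of the standard SPC run rules, and it's simple to check mechanically. Here's a minimal sketch (the data and function name are hypothetical, not from the actual chart):

```python
# A minimal sketch of the SPC "run rule": a process shift is signaled by
# eight consecutive points on the same side of the mean line.

def has_shift(points, mean, run_length=8):
    """True if `run_length` consecutive points fall on the same side of the mean."""
    streak = 0
    side = 0  # +1 above the mean, -1 below, 0 exactly on the line
    for p in points:
        current = 1 if p > mean else (-1 if p < mean else 0)
        if current != 0 and current == side:
            streak += 1
        else:
            side = current
            streak = 1 if current != 0 else 0
        if streak >= run_length:
            return True
    return False

data = [71, 73, 72, 74, 73, 75, 74, 76, 75, 74, 73, 75]  # made-up values
baseline_mean = sum(data) / len(data)
print(has_shift(data, baseline_mean))  # False -> no eight-point run, no signal
```

Points exactly on the mean line break a streak here; some SPC conventions handle that case differently, so treat this as one reasonable interpretation.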
Here is a full-blown SPC chart, showing upper and lower control limits:
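For readers curious where those limits come from: for an XmR (individuals) chart, the control limits are conventionally the mean plus or minus 2.66 times the average moving range. Here's a minimal sketch with made-up data (the actual chart in this post may use different numbers):

```python
# A minimal sketch of XmR (individuals) chart control limits:
#   limits = mean +/- 2.66 * (average moving range)
# The 2.66 constant is the standard factor for individuals charts.

def xmr_limits(points):
    mean = sum(points) / len(points)
    # Moving range: absolute difference between each consecutive pair of points
    moving_ranges = [abs(b - a) for a, b in zip(points, points[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

data = [72, 74, 71, 75, 73, 76, 74, 72, 75, 73]  # hypothetical values
lcl, mean, ucl = xmr_limits(data)
print(round(lcl, 1), round(mean, 1), round(ucl, 1))  # 66.7 73.5 80.3
```

Any point outside those limits, like an eight-point run, signals a change worth investigating rather than ordinary noise.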
Here are those charts side by side – which do you think presents a more accurate representation of the trend, or lack thereof?
Anyway, maybe it's just personal preference and these are three charting practices that just bug me… what do you say?