Clinical Operations Process Improvement (Part III – TMI!)


The biggest mistake companies make when working with metrics is that they tend to make things far more complicated than they need to be. Specific examples abound, but they boil down to three general categories. I will discuss the first one here.

Collecting Too Much Data

Collecting too much data makes it hard to focus and act on the metrics that matter. Albert Einstein once said, “Everything should be made as simple as possible, but not simpler.” Following this advice, the first rule of metrics is that all measures should be “actionable”. If you’re not taking an action (or refraining from doing so) based on a metric, you don’t need it.

Moreover, those actions should serve to drive positive change at some level of the organization. Having too many metrics causes confusion and obscures clarity of purpose. Employees are charged with carrying out practices aimed at improving metrics, and they must also report progress back so executives can see whether those programs are working. If metrics are too complex, the handoffs from goal to implementation to feedback can easily be mishandled.

Keep it Simple

If you can’t express an objective in three to five key thoughts, it’s probably too complicated. Simplifying drives performance by enabling comprehension. Companies must beware of the “DRIP” principle: “data rich but information poor”.

In the 1990s, GM tracked several hundred metrics on a monthly basis. According to Jay Wilber, director of Quality Programs at GM, “We were measuring everything.” GM executives then went through an extensive analysis of which metrics offered meaningful and actionable information, and the manufacturing division later narrowed its scorecard down to 30–50 metrics.

In another example, the firm Bain & Co. at one time had a large client with revenues on the order of $20 billion. The client’s CEO received 6,000 metrics and was somehow expected to form a mental picture of the health of the company from them. Much time was wasted debating what all the numbers meant. These metrics were later condensed into a single-page dashboard of 25 key performance indicators for the CEO to focus on. In addition, each executive team member received a personal one-page dashboard tailored to their area of oversight.

Do any of these examples ring a bell? While these may be extreme cases, it’s not unusual for a clinical operations group to use dozens (and dozens) of charts, graphs, and tables to manage its business where a fraction of that number would do.
