This is not about—surprise, surprise—a one-horse open sleigh. It is about being able to show how good, or not-so-good, you have been in your supply chain operations. Dashboarding, it seems, has become a four-season adventure.
Many, many organizations use "dashboards" to show how they are doing. Publication might be daily, weekly, or monthly, with monthly being the most frequently used at high levels, and daily being used to give rapid feedback on high-velocity volume movement, among other applications. Sometimes, daily visuals are posted as indicators of performance of lower-volume, but higher-cost-impact operations, and/or to provide recurring stimuli for performance motivation.
In theory, dashboards provide mission-critical information about factors that are core drivers of organizational performance. In practice, they are overloaded with trivia, abused in their motivational role, and simply too much for anyone, CEOs and order pickers alike, to process and internalize.
THE USUAL SUSPECTS
Who has led us to a state in which a perfectly good idea has been rendered less-than-useful, and even misleading? We nominate an unholy alliance between accountants and engineers. It's not because they are bad people; they are often just wired differently from doers and decision-makers. They can get a little compulsive about filling in all the blanks or every cell in a logical matrix, no matter the relevance or the potential usefulness.
It is far better, for example, to know at a glance that inventory turns are approximately "x" than it is to be overwhelmed with half a dozen different turn performance indicators for six categories of inventory. It is useful to see that a weighted average productivity value is "y" instead of having to sort out individual productivity numbers for a couple dozen specific tasks.
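To make that roll-up arithmetic concrete, here is a minimal sketch of how the single figures get built from the detail underneath them. The task names, volumes, and dollar figures are hypothetical illustrations, not data from any real operation.

```python
# Minimal sketch: rolling many task-level numbers up into the one or two
# figures a dashboard actually shows. All values below are hypothetical.

tasks = {
    # task: (units handled, labor hours)
    "case_pick": (12_000, 400),
    "each_pick": (30_000, 1_500),
    "replenish": (8_000, 250),
}

# One volume-weighted productivity value ("y") instead of a number per task
total_units = sum(units for units, _ in tasks.values())
total_hours = sum(hours for _, hours in tasks.values())
weighted_productivity = total_units / total_hours  # units per labor hour

# One aggregate inventory-turns value ("x") instead of six category figures
annual_cogs = 48_000_000         # cost of goods sold, all categories combined
avg_inventory_value = 6_000_000  # average on-hand value, all categories combined
turns = annual_cogs / avg_inventory_value

print(f"Productivity: {weighted_productivity:.1f} units/hour; turns: {turns:.1f}")
```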
We will grant that being able to examine the next level of detail can be important when the singular metric begins to get out of whack, but it is the clarity and simplicity of the singular measure that triggers an investigation into specific problem (or success) categories. And scanning a laundry list of lower-level "dashboard" data is likely to receive less attention than a highly visible single indicator of a key driver of business success.
TO ILLUSTRATE THE CASE
The classic, even trite, analogy is comparing performance dashboards with automobile dashboards. Think about it. There are only a few things you really need to know about how the car is doing. And these are even fewer today than they were two or three decades ago. So, the speedometer, the gas gauge, and tire pressure indicators are vital. Warnings about engine temperature, oil pressure, and diverse on-board computer problems show up when a "check engine" light goes on.
The tachometer is only important if you are driving your Chevy Aveo four-banger around the track at Daytona, but the marketing team loves the concept. Still, there is no reason on earth to make the auto dashboard as complex as a 747's cockpit.
Yet we try, and the information available in contemporary information systems gives us plenty of trivia to drool over. Our Dutch colleague once developed a warehouse performance reporting system that presented pages of "dashboard" dials. The idea was attractive, in that the icons were all little dials with red, yellow, and green segments based on performance history, targets, or imperatives, as the case might be. Each of the dozens of "key" indicators could be clicked on to reveal another level of detail. An analyst's dream, as in this case, can become a manager's nightmare.
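The dial mechanics themselves are trivial, which is part of the seduction. A minimal sketch of that red/yellow/green banding, with hypothetical thresholds standing in for the history-, target-, or imperative-based bands the real system used, might look like this:

```python
# Minimal sketch of red/yellow/green banding for a dashboard dial.
# The threshold values are hypothetical illustrations only.

def dial_color(value: float, red_below: float, green_above: float) -> str:
    """Map a metric value onto the dial's color band."""
    if value < red_below:
        return "red"     # danger zone: investigate now
    if value < green_above:
        return "yellow"  # watch zone: drifting, but not yet critical
    return "green"       # on target

# Example: on-time arrival percentage with hypothetical bands
print(dial_color(93.5, red_below=90.0, green_above=97.0))  # -> "yellow"
```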
So, developers of effective dashboards need to spend a lot of front-end time in deciding which are the right balls to be watching and which need to be let go for another day.
WATCHING THE BALLS THE RIGHT WAY
Some analysts, less inclined to focus on every little nit, maintain that longer-term and aggregate metrics are more indicative of what is really going on over time. We will cheerfully admit that it is easier to look at a chart of rolling 12-month trailing data to get a clear picture of the aggregate effects of past events. Certainly, it's much easier than trying to interpret a chart of specific data points over time, with wild fluctuations and unexplained dips, and snuffles and sneezes.
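For readers who want the mechanics: a trailing 12-month figure simply averages the current month with the 11 before it, which is what smooths out the snuffles and sneezes. A minimal sketch, with hypothetical monthly throughput numbers:

```python
# Minimal sketch of a rolling 12-month trailing average.
# The monthly throughput figures are hypothetical illustrations.

monthly_throughput = [
    980, 1_010, 1_150, 990, 1_220, 1_080,
    1_300, 1_270, 1_100, 1_050, 1_190, 1_240,
    1_310, 1_280,
]

def trailing_12(series: list[float]) -> list[float]:
    """Return the 12-month trailing average at each month with a full year behind it."""
    return [
        sum(series[i - 11 : i + 1]) / 12
        for i in range(11, len(series))
    ]

for month, avg in enumerate(trailing_12(monthly_throughput), start=12):
    print(f"month {month}: trailing average = {avg:,.0f}")
```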
But the argument misses the point. Dashboards are not intended to be examined in their totality over numerous discrete time periods. They are designed to illustrate what's happening now and to stimulate improvement, corrective action, and deeper probes into those performance elements that are deviating from expectation (in any direction).
Looking at the easy-on-the-eyes 12-month aggregation chart may allow a smashing post mortem of a specific issue. It was, for example, relatively easy to reconstruct what happened with the Titanic once it had sunk to the ocean floor.
So, no matter what format a time-series analysis might take, timely intervention in response to a dashboard reading is critical to remedying situations before the ship goes down.
WHAT SHOULD BE ON A DASHBOARD?
The consulting answer: it depends. It depends on whether you are in transportation or distribution. It depends on your organization's mission, and what factors define success. It depends on what your customers value. And it depends on the style and preferences of your top management.
That will almost surely mean that there are different dashboards for different areas, and that's OK. For example, in a distribution center, you might be tempted to make on-time shipments a metric. But your customers probably don't care. They are only concerned about on-time arrivals, and that might be a better dashboard element for the transportation (or 3PL) function.
Then, there is the issue of how to integrate dashboards into a cohesive enterprisewide view, and these are tricky waters. The next greatest failing after providing too many dashboards is hammering people for improvement in metrics that they cannot directly affect.
But it is fair to do two things: show all associates the dashboards that indicate how the company is performing (as informational items only), and teach all associates how their local dashboards contribute to the enterprise whole.
By the way, our preferred icon is a dial, rather than, for example, a thermometer. There does need to be room to show regression as well as progress. The use of color-coding to indicate success, failure, and a danger zone is usually very effective. Nearly everyone in Western cultures knows what a dial looks like and what it indicates, and relates to the usual left-to-right progression of improvement.
That gets us down to the question of how many. Again, it depends. A case might be made for as many as six. No more. And a mission-critical three would be a worthwhile target. A major apparel retailer's Western distribution center had fabulous success with a three-dial dashboard that was the first thing every associate saw when arriving for work. It focused on volume productivity, quality (error rates), and the HR element of associate retention.
OUR TAKE
Find the balance, the critical few, the keys that relate directly to individual units and that people can interpret and relate to. Figure out how to integrate local dashboards and metrics into a more cosmic enterprise view. Use visible results to motivate continuous improvement. Keep it real. Find out who's been naughty or nice. And let it snow, rain, or whatever.
It's that simple ... and that complex.