
Dashboards and Reporting

Reports and dashboards are common in most organisations, with reports serving an historical and archival function, and dashboards facilitating active process management.

Dashboards are essential for anyone actively monitoring a process. Unfortunately, dashboards get a bad rap for several reasons, some more deserved than others. Before we delve into dashboards, let’s step back and understand what we expect to gain from one: insights. Insights and, ostensibly, exceptions. We can see where dashboards fit in by investigating an analytics maturity landscape.

Analytics Maturity Model

The story starts with data. Data are foundational to any dashboard or report.

Data

Data are the raw materials, but as with manufacturing, data are processed to yield more value. This can be visualised in a common analytics maturity model.

Information

Data mature into information. Fundamentally, this is the primary function of reporting—taking raw data and combining them with other data into something more human-readable. We’ve grown from a bunch of numbers and words into sentences and paragraphs. We’ve grown beyond infancy and are heading into preschool.

Information requires data, but data alone aren’t sufficient. Many reports I’ve seen aren’t particularly useful. I’ve reviewed report content with the consumers of these reports, and they don’t know why some aspects of the report are present—even if they designed the report themselves. Often I’ve heard something like, ‘I might need to look at that at some point’. Hardly a strong selling point.

Insights

A well-designed report can graduate you from preschool to grade school, but to get beyond this, one needs insights. Information is to data as insights are to information. If data are the raw materials, information is the intermediate good. It’s been processed. We’ve gone from iron ore to ingots or even steel beams, but we haven’t constructed anything. Insights are this construction. We’ve gone from paragraphs to a well-crafted story.

Insights are an emergent property, but they can be coaxed out by understanding how to interpret the data and by having a mechanism to deliver insights to an operator. It is no longer incumbent on the reader to suss out meaning; the meaning is crystallised into poignant insights. Actionable insights is a phrase I’m fond of using.
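
To make this concrete, here is a minimal sketch of the progression: data aggregated into information, information interpreted into an actionable insight. The metric names, target, and wording are all invented for illustration.

```python
# A minimal sketch of the data -> information -> insight progression.
# The records, target, and messages are invented for illustration.

raw_orders = [  # data: raw event records
    {"region": "north", "value": 120}, {"region": "north", "value": 80},
    {"region": "south", "value": 40},  {"region": "south", "value": 35},
]

# Information: aggregate raw data into something more human-readable.
totals = {}
for order in raw_orders:
    totals[order["region"]] = totals.get(order["region"], 0) + order["value"]

# Insight: interpret the information against a target and tell the
# operator what it means and what to do about it.
TARGET = 100  # assumed per-region target
for region, total in totals.items():
    if total < TARGET:
        print(f"{region}: {total} vs target {TARGET}. "
              f"Below target: review the {region} pipeline this week.")
    else:
        print(f"{region}: {total} vs target {TARGET}. On track.")
```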

Predictive

There is life beyond grade school. Rather parallel to insights is the ability to predict what might happen through forecasting or by performing what-if analysis. The former sees us extrapolating trend data and predicting probabilistically what might be expected to happen next; the latter allows us to manipulate input variables to reveal several possible futures, shedding light on how we might adjust those variables to tease out our preferred outcome.
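
Here is a toy sketch of both halves: extrapolate a linear trend, then nudge an input variable to peek at alternative futures. The figures and the ‘boost’ variable are invented, and a real predictive model would quantify its uncertainty, too.

```python
# A toy forecast that extrapolates a linear trend (ordinary least
# squares on weekly figures), then runs a crude what-if analysis.
# All numbers are invented for illustration.

history = [102, 108, 113, 121, 127]  # assumed weekly figures

n = len(history)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(history) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

next_week = slope * n + intercept
print(f"Trend: {slope:+.1f} per week; forecast for next week: {next_week:.0f}")

# What-if: manipulate an input variable and inspect the possible futures.
for boost in (0.0, 0.1, 0.2):  # e.g. 0%, 10%, 20% extra capacity (assumed)
    print(f"With a {boost:.0%} boost: {next_week * (1 + boost):.0f}")
```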

Prescriptive

These predictive models can be extended even further by employing optimisation models to prescribe the best path forward. We’ve now graduated university. These data not only yield actionable insights, but these insights are further processed into instructions. A is up 10 per cent and B is steady. Conditions X, Y, and Z are in whatever state, so you should do this next. This removes the human decision factor and, at this level, the possibility of human error or bias. The focus here is on the model itself, which can be refined over time as a single focus. We’ve mostly removed the human element, but we can remove it entirely.
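
As a deliberately simple sketch, here is a rule table standing in for what would normally be a proper optimisation model. The conditions and instructions are invented; the point is the shape of the thing, readings in, explicit instruction out.

```python
# A toy prescriptive layer: observed conditions go in, an explicit
# instruction comes out. In practice this would be an optimisation
# model; the rules and wording here are invented for illustration.

def prescribe(a_change: float, b_change: float, backlog_high: bool) -> str:
    """Turn observed conditions into an instruction for the operator."""
    if a_change >= 0.10 and abs(b_change) < 0.02 and backlog_high:
        return "A is up 10%+ and B is steady: add a second shift."
    if a_change <= -0.10:
        return "A is down 10%+: pause the promotion and review pricing."
    return "All conditions nominal: no action required."

print(prescribe(a_change=0.12, b_change=0.01, backlog_high=True))
```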

Autonomous

Once we trust the model to predict and prescribe accurately, we can allow it to take action autonomously. If the prescribed course of action has yielded the expected and acceptable results, there is no reason for human intervention. This is the world of programmed trading. Monitor the model and refine it, but at no point does a human need to intervene. You may not trust your models—and I wouldn’t blame you, especially if they were constructed by humans—so you might want to put monitors and variance parameters in place. You might also want to force a human to interact periodically, as with assisted-driving features. Every so often, remind the driver to take the wheel, knowing full well that the assist will outperform the human 99.999 per cent of the time. And there is no promise that when the remaining 0.001 per cent comes into play the human will make the right move, or make it fast enough anyway, but hubris is a strong cognitive deficit, so at least give it a conceit.
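
A sketch of that arrangement: act automatically while the model stays within its variance parameters, escalate when it doesn’t, and force a periodic human check-in. The thresholds and cadence are invented.

```python
# A sketch of the autonomous rung: execute the prescribed action
# automatically, guarded by a variance parameter and a periodic
# human check-in. Thresholds and cadence are invented.

MAX_VARIANCE = 0.05     # act only while the model stays within bounds
HUMAN_CHECK_EVERY = 5   # force a human touchpoint every N cycles

def run_cycle(cycle: int, observed_variance: float) -> None:
    if cycle % HUMAN_CHECK_EVERY == 0:
        print(f"cycle {cycle}: periodic human review requested")
    elif abs(observed_variance) > MAX_VARIANCE:
        print(f"cycle {cycle}: variance {observed_variance:+.2%} out of "
              f"bounds; escalating to a human")
    else:
        print(f"cycle {cycle}: within bounds; executing prescribed action")

for cycle, variance in enumerate([0.01, 0.02, 0.08, 0.01, 0.02, 0.03],
                                 start=1):
    run_cycle(cycle, variance)
```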

Dashboards

So dashboards get a bad rap for several reasons. Let’s look at them in turn.

Bad Selection of Performance Indicators

One reason for the bad rap is obvious and somewhat related to the others: the dashboard isn’t tracking meaningful performance indicators. Whatever it’s tracking was not well considered and isn’t really integral to the processes. This is a classic case of garbage in, garbage out. Although bad data and the timeliness of data refreshes are factors, bad design is more often the culprit. More often than not, when I’ve asked ‘what does that do?’, I get a cogent response. When I follow up and ask what happens when it’s out of range, the answer is ‘we meet to discuss what to do’ because ‘it depends’. You may have heard some people say that there are no wrong answers. This is a wrong answer. This meeting should have already taken place when the decision was made to include this instrument, and the dependencies behind ‘it depends’ should already be known. Granted, an unexpected dependency could arise—a veritable unknown unknown—but that wasn’t the question. Capture the dependencies and create a case to handle exceptions. For this, you can have your meeting.
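
One way to enforce this discipline, sketched below with invented names, ranges, and actions: an indicator doesn’t earn a place on the dashboard until its out-of-range responses have been decided. If you cannot fill in the actions, the instrument isn’t ready.

```python
# A sketch of design-time discipline: every indicator carries its
# out-of-range responses, agreed before it reaches the dashboard.
# The indicator, range, and actions are invented for illustration.

INDICATORS = {
    "order_backlog": {
        "range": (50, 200),
        "below": "Backlog unusually low: check the intake feed for outages.",
        "above": "Backlog too high: reassign two agents from returns.",
    },
}

def evaluate(name: str, reading: float) -> str:
    spec = INDICATORS[name]
    low, high = spec["range"]
    if reading < low:
        return spec["below"]
    if reading > high:
        return spec["above"]
    return "In range: no action required."

print(evaluate("order_backlog", 230))
```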

Incompetent Operators

I’m not trying to make the case that your employees are abject idiots. My point is that if the person monitoring and interpreting an indicator is not fluent in translating it, it’s pointless to put it on their dashboard. This is akin to the idiot light on your car’s dashboard: ‘By the way, something is wrong with your vehicle. It could pretty much be anything, but just so you know’. So even if the indicator is relevant, if the user doesn’t know what to do based on the reading, it serves no purpose.

I was working with a construction company, and a manager explained to me a particular indicator that showed the ratio of planned versus actual progress. There was an acceptable range for variance.

If the variance was exceeded on the low side, this indicated timeline and budget risk. Tasks weren’t being completed on time, and completing them would consume additional budget. Downstream dependencies might be negatively impacted. The action was to communicate with the foreman. The further plan was to get this indicator into the hands of the foreman and skip the middleman.

If the variance was exceeded on the upside, then we were working ahead of plan. Sounds great. Google Maps tells you it’s going to take 5 hours to get to Pittsburgh, but you are doing 90, so you’ve got that beat.

Not so fast. As with over-indexing your speed to Pittsburgh, going too fast is a risk indicator. Quality can go down, resulting in unsafe construction, or in rework if it’s caught in a quality assurance phase. Moreover, like driving at 90, accidents are more apt to occur when one is rushing.
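
Here is a sketch of that indicator, with an assumed tolerance band and both sides flagged as risk. The firm’s actual thresholds and responses were, of course, their own.

```python
# A sketch of the planned-versus-actual progress indicator described
# above. The tolerance band and messages are assumptions.

ACCEPTABLE_BAND = (0.90, 1.10)  # assumed acceptable range for variance

def progress_signal(actual: float, planned: float) -> str:
    ratio = actual / planned
    low, high = ACCEPTABLE_BAND
    if ratio < low:
        return (f"Ratio {ratio:.2f}: behind plan; timeline and budget "
                f"risk. Talk to the foreman.")
    if ratio > high:
        return (f"Ratio {ratio:.2f}: ahead of plan; quality and safety "
                f"risk. Schedule a QA walk-through.")
    return f"Ratio {ratio:.2f}: within tolerance."

print(progress_signal(actual=85, planned=100))   # behind plan
print(progress_signal(actual=118, planned=100))  # ahead of plan
```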

Poorly Disguised Report

One of my pet peeves is reports masquerading as dashboards. The purpose of a dashboard is to manage controllable events in the moment. The purpose of a report, on the other hand, is to present an historical or transactional perspective. If your dashboard is just a lot of information, not distilled for at-a-glance interpretation, you’re being hoodwinked. Somebody did not do the upfront work to titrate your information, so now you have to. This is the difference between buying quality furniture and buying Ikea. Sure, you saved some money, but the quality doesn’t compare, and you have to assemble it yourself.

Although we haven’t gotten into the area of reporting quite yet, I think it’s safe to assume that we all know what a report might be. If not, read ahead and return to this, that much the wiser. If you are having difficulty grasping the distinction between a dashboard and a report, and why a report makes for an awful dashboard, imagine an automobile dashboard. Now imagine your car’s dashboard being displayed as a report. Instead of a speedometer, you are receiving a constant readout of your speed and direction at millimetre intervals. Why not toss in RPMs and altitude, because you might be interested at some point. Pitch and yaw? Why not.

Ridiculous, right? What about moving averages for these factors instead? That’s better. Right? And newer cars have alerts for exceeding the speed limit, going out of lane, running low on petrol, and miles remaining. Some of these are even configurable by the driver, the user. The point is that a dashboard tells you what you need to know in the moment. Sure, you can view your trip meter and drill down into less immediate needs through the dashboard, but this is a user-interface convenience and a secondary aspect.

Reports

Unlike dashboards, which are for immediate action, reports are the scenic route. There is a lot of pertinent information to peruse. I’m not saying that reports should be something to meander through; that’s what queries, spreadsheets, and data interfaces are for. Reports are more structured and provide archival benefits, but if you are relying on reports where immediate action is necessary, you’ve made a bad purchase. Keep it if it makes you feel better, but build a dashboard instead.

Laundry Lists

I’ve seen some very well-constructed reports in my day. With apologies to vegans and vegetarians: all meat; no fat. But many reports I’ve seen over the years are a lot of fat and gristle. No wonder some people go vegan. These reports take a more-is-more approach, creating laundry lists with everything plus the kitchen sink. There is no rationalising or prioritising. Countering the Nike slogan: just don’t do it. Be mindful. I recommend writing two contextual reports over a single monolithic report to cover contingent contexts.

Exception Reporting

In many cases, archival purposes notwithstanding, that report may be unnecessary. If all you need to see is exceptions, don’t confound your report with happy cases. Ask yourself whether you even need a report or whether a notification or alert might be more suitable. This advice is valid for dashboards, too. Exception conditions such as low oil pressure, elevated engine temperature, low tyre inflation, door ajar, and so on are decent examples. Although a counterargument can be made for signalling the happy case—for example, the exception signal itself might be broken; the idiot light is burnt out—provide access to that a click away, but don’t clutter your dashboard. This can be a report.
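
A sketch of that filtering: suppress the happy cases and surface only the exceptions as alerts, keeping the full picture a click away. The readings and limits are invented.

```python
# A sketch of exception reporting: only out-of-range readings become
# alerts; the happy cases stay off the dashboard. Values are invented.

readings = [
    {"sensor": "oil_pressure", "value": 18, "min": 20, "max": 80},
    {"sensor": "engine_temp",  "value": 92, "min": 70, "max": 110},
    {"sensor": "tyre_psi",     "value": 24, "min": 30, "max": 36},
]

exceptions = [r for r in readings
              if not (r["min"] <= r["value"] <= r["max"])]

for r in exceptions:
    print(f"ALERT {r['sensor']}: {r['value']} outside "
          f"[{r['min']}, {r['max']}]")

# The happy cases still exist; keep them a click away, in the report,
# rather than on the dashboard.
```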

</Rant>

OK. I admit it. This is somewhat of a rant. I get triggered each time I see these things misused. It’s like the grammar Nazi who can’t help but be triggered by the sign in the checkout lane that reads ‘5 items or less’ when the rule is that it should be ‘fewer’, or ‘don’t they know that t-h-e-r-e is not the possessive pronoun’, or that we flesh things out, not flush things out, unless we’re plumbers.

Is this over yet?
