One of the major challenges in any supply management effort is to maintain support for the effort within an organization. Unfortunately, spend analysis (SA) has long been a weak link in the emotional support structure of spend management initiatives, for two reasons:
- Vendor over-promises with respect to data cleansing accuracy lead to disillusionment when (inevitable) errors are found;
- The inflexibility of SA data structures and hierarchies leads to disillusionment and abandonment of the SA system as a useful analysis tool.
With respect to the first point, I wrote in an earlier post that SA vendors tend to compete on the quality of their data cleansing services, because the rest of the offering tends to be undifferentiated. I remember a 2001 meeting at a New York financial institution, where one of the key opponents of the spend management initiative drilled into the weeds of an SA dataset and found poorly-mapped spend – which he of course immediately brought to the attention of the entire group. I can’t remember exactly what he said, but it was something along the lines of: “If you can’t get THIS right, why should we believe ANYTHING you’ve mapped, or ANY of the conclusions you’ve presented?”
It’s always possible to drill into the weeds and find mistakes. That’s why data cleansing promises are so dangerous. One slip of the finger during a midnight mapping session – one mistake by an automaton (classifying “flour. bulb” – a misspelling of “fluor. bulb” – as flour, for example) – and opponents of the initiative are deeply empowered. The worst part of this situation is that errors can’t be corrected quickly in most SA systems, because they are batch-oriented, read-only data warehouses. In the case of our meeting, instead of the vendor saying, “Oh, thanks!” – and, with a few mouse clicks, fixing the mistake and moving on – there the errant spend sat, glaringly obvious to all, for the remainder of the meeting. In fact, it sat there for the next few weeks, until the next monthly “refresh” of the spend cube was performed.
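To make that failure mode concrete, here is a minimal sketch – with made-up rules and category names, not any vendor’s actual mapper – of how a naive keyword-based auto-classifier misfiles a misspelled line item:

```python
# Hypothetical keyword rules, checked in order; first match wins.
RULES = [
    ("flour", "Food / Baking Ingredients"),
    ("fluor", "MRO / Lighting"),
]

def classify(description: str) -> str:
    """Naive auto-mapper: first keyword found in the description decides."""
    d = description.lower()
    for keyword, category in RULES:
        if keyword in d:
            return category
    return "Unmapped"

print(classify("bread flour, 50 lb"))   # Food / Baking Ingredients
print(classify("flour. bulb, 34W T8"))  # Food / Baking Ingredients -- wrong
print(classify("fluor. bulb, 34W T8"))  # MRO / Lighting
```

Because the misspelled “flour. bulb” never contains the string “fluor,” no amount of rule reordering saves the automaton here – only a human who knows the supplier and the commodity will catch it.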
Which leads us to the next guiding principle of Web 2.0 Spend Analysis:
- Dataset changes should not have to wait for the “refresh” cycle. Change is a fundamental part of the SA process, and changes need to be made instantly, constantly, and effectively in order for the SA system to remain relevant and useful to key stakeholders.
Imagine the number of data mapping errors that are discovered as customer procurement experts walk through spending data with which they are intimately familiar. No SA vendor or offshore third-party cleanser (and certainly no automaton) can possibly know what those experts know. Mistakes are inevitable, and changes pile up quickly. By the time the next refresh is published, say by week two of the following month, even more changes have piled up that won’t be reflected in the new dataset. It’s the proverbial dog chasing the fire truck – the dog is committed, and he does his best to catch up, but he’ll ultimately be left behind. The result is deep frustration and an eventual abandonment of the SA system as anything other than a transaction store.
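What “instantly, constantly” means in practice can be sketched in a few lines – a hypothetical in-memory dataset, not any particular SA product – where an expert’s correction takes effect the moment it is made, with no batch rebuild in between:

```python
# Hypothetical live dataset: corrections apply immediately, in place,
# instead of being queued for a monthly "refresh" of a read-only cube.
transactions = [
    {"id": 1, "desc": "flour. bulb, 34W T8", "category": "Food / Baking"},
    {"id": 2, "desc": "bread flour, 50 lb",  "category": "Food / Baking"},
]

def remap(txns, txn_id, new_category):
    """Apply an expert's correction to the live dataset right now."""
    for t in txns:
        if t["id"] == txn_id:
            t["category"] = new_category

remap(transactions, 1, "MRO / Lighting")  # "Oh, thanks!" -- a few clicks
print(transactions[0]["category"])        # MRO / Lighting
```

The point is architectural, not algorithmic: when the dataset is writable by the people reviewing it, the error found in the meeting is gone before the meeting ends.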
With respect to inflexibility of data structures and hierarchies, the same reasoning applies. Typically no committee decision on a hierarchy structure will survive the first analysis report – stakeholders want to see the hierarchy in some new and special way. “Today let’s group Puerto Rico with the Southeast Region rather than with the Caribbean, and let’s also combine the Northeast, Midwest, and Canada into a new meta-region” – a perfectly reasonable request that unfortunately makes the existing spend dataset organization (and all the power of the OLAP data warehouse) useless. Again, the result is disillusionment: “You mean, you can’t change the hierarchy until next month – and I’ll need to get permission from all the other users of the system first??”
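The Puerto Rico request above is trivial when the hierarchy is just data the analyst can edit. A hypothetical sketch – not a real OLAP schema – of regrouping and rolling up on the fly:

```python
# Hypothetical region hierarchy as an editable mapping: regrouping never
# requires rebuilding a cube or getting sign-off from other users.
region_of = {
    "Puerto Rico": "Caribbean",
    "Florida": "Southeast",
    "New York": "Northeast",
    "Ohio": "Midwest",
    "Ontario": "Canada",
}

# Today's view: Puerto Rico joins the Southeast, and the Northeast,
# Midwest, and Canada collapse into one new meta-region.
region_of["Puerto Rico"] = "Southeast"
for place, region in region_of.items():
    if region in ("Northeast", "Midwest", "Canada"):
        region_of[place] = "NE/MW/Canada"

# Roll up (made-up) spend figures against the rewritten hierarchy.
spend = [("Puerto Rico", 120), ("New York", 300), ("Ohio", 80)]
totals = {}
for place, amount in spend:
    totals[region_of[place]] = totals.get(region_of[place], 0) + amount

print(totals)  # {'Southeast': 120, 'NE/MW/Canada': 380}
```

Tomorrow’s analysis can rewrite the mapping again; nothing downstream has to be rebuilt.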
And let’s not even talk about cost center hierarchies – for every dotted line in the management hierarchy, there’s another interpretation for how costs should roll up. It’s imperative that the spend analysis system be agile enough to be altered in real time to any perspective, to support the analysis of the day.