Data Analytics as a Matter of Pace, Not Speed
Any procurement professional worth their salt will tell you that a flat savings number means nothing on its own. Without the context of category size or annual projected spend, the figure can't be evaluated. That is why we always present our results in percentage form: it provides the relativity needed to determine whether $10,000 in savings was achieved on a $1M category (1% savings) or a $100K one (10% savings).
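The relativity point above amounts to a one-line calculation. A minimal sketch (the function name is illustrative, not from any particular procurement tool):

```python
def savings_percentage(savings: float, category_spend: float) -> float:
    """Express an absolute savings figure relative to total category spend."""
    return savings / category_spend * 100

# The same $10,000 saving reads very differently depending on category size:
print(savings_percentage(10_000, 1_000_000))  # 1.0  -> 1% of a $1M category
print(savings_percentage(10_000, 100_000))    # 10.0 -> 10% of a $100K category
```

The absolute number is identical in both calls; only the denominator gives it meaning.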
But data analytics – well now, that’s different. The faster the better and yesterday if possible! Everyone knows that real-time is the ideal speed for analytics, unless of course predictive analytics are available. Then we expect to have next week’s data on our desks yesterday.
The trouble with this line of thought mirrors the opening example about relativity and context in savings. High-speed data analytics only has value if it is actionable at the same pace at which the data becomes available – ready when and where it is needed, while the organization is still looking to take action on it. Companies need to consider their actual requirements and deploy automation that matches them, rather than pushing the business faster simply because the available technology can move at that speed. Just as a savings figure gains meaning from the total value of the budget or category it reduces, analytics are only relevant when they match the current pace of decision making.
The idea of analytics being available in real time is very sexy. It is alluring to think that data could be centrally available as it is being created by business transactions. The image is bolstered by the general appearance of leading-edge analytics solutions: sleek, visually appealing, and easily navigated. They provide immediate feedback and give the function that manages them influence not just over information and conclusions, but over the very processes and actions that lead to their creation.
It is not hard to imagine that the possibility of such a solution generates great excitement and enthusiasm in the C-suite, quickly followed by the question, “Why don’t we have one of those?”
The honest answer to that question is actually another question: “What would we do with it if we did?”
The trouble with pushing for fast analytics rather than high-quality, actionable analytics is applicability. If analytics are not applicable, either because they lack context or because they arrive at a moment when they can't be leveraged, the critical link between function and utility is broken. If you aren't prepared to make decisions in real time – and very few organizations are – then you don't need or even want real-time analytics. In fact, the raw blast of data coming at you in real time could be disruptive and diminish your ability to make good decisions on a more realistic two- or three-week timeline.
In actuality, analytics has come a very long way in terms of both speed and quality. When analytics was a separate ‘bolted on’ component of procurement software, users had to select a subset of data and then move it over to analytics for processing and reporting. Not only did this insert an unfortunate delay, it instantly broke that data off from the active flow of information, isolating procurement from the strategies and objectives of the company. As current as the available data might have been – and there was a time when it wasn’t all that current – it was forced to become a dead offshoot to go through the analytics process before becoming actionable.
The increased speed of integrated activities, data, and analytics that is possible today allows procurement to support in-line decision making; the increase in speed has led to an increase in value. It would be natural to think that continuing to increase the speed of data availability would provide a corresponding increase in benefits, but that may not be the case.
Every potential gain comes with costs as well as benefits. We talk about this in the context of supply chain risk management all the time. If a company decided to drive out ALL risk, the price tag would be so high that they would have an impeccable risk-free record for the one or two days they could afford to remain in business. Realistic risk managers therefore don't aim for zero risk; they aim for the point of diminishing returns, spending as much as is financially advantageous to reduce risk, but not a dollar more. Spending one dollar too much to prevent risk is the same as losing a dollar – not something anyone wants to be associated with in the long run. Finding that point is no easy task.
If we apply this same approach to data analytics, the goal should not be perfect data or immediate data: one is too slow and costly and the other too fast and wild. No, a successful data analytics program provides the right amount of accurate data to decision makers when they are prepared to render a decision based upon it, not a minute sooner and not a moment later. Any investment in analytics that causes data to sit unused and become stale is the same as the dollar we threw out the window by spending too much to minimize risk, and results in a negative ROI. The secondary challenge this creates for procurement is the need to constantly read and evaluate the changing speed of decision making in the company, making sure both processes and technology keep pace.
Data analytics is not implemented for its own sake, but for the actions and decisions it informs. Being able to answer the question, “What should we do next?” with a high degree of confidence and accuracy is far more important than having the snazziest solution in the industry.
If real-time data could be counted upon to motivate effective real-time decisions, it would be a disruptive force for positive change. In practice, however, the choice seems to be between real-time data and meaningful data – perhaps because our own internal risk calculations are not ready for the potential downside of making a decision too fast, or on a partial truth that happens to be the only fact available in the moment.
The success of any technology depends on how easy it is to implement and use. Software with powerful, comprehensive functionality that is complex and difficult to use can rarely drive the expected efficiency and business results. SMART by GEP is a pioneering example of what the user experience in the latest generation of procurement software can be.
SMART by GEP is as powerful and capability-rich as it is easy to use: fully functional procurement software that is as approachable as any consumer product, with an intuitive, attractive interface and a rewarding user experience.