Part 4 – Finding the ROI for an Investment in an Analytical SCM Solution

Technologyevaluation.com published a piece of mine on this topic a few years ago, but the ideas are important, so I am recapitulating them here.  The first post in this series introduced the topic of overcoming the challenges to calculating the return on an investment in an analytical supply chain software application.  This post deals with the third of four challenges.

Part 4 — Third Challenge — Making Sense of the Data — “We have tons of data, but it is not telling us what we need to know!”

Analyze the Data

If the data exist, you need to trace a symptom, like excess work-in-process (WIP) inventory, to a root cause, such as forecast error that drove production of the wrong product.  Once that is done, a powerful but relatively simple analysis can be performed: collect the data from the data warehouse, or wherever it is stored, put it into a spreadsheet, and create a cumulative distribution of the symptoms by reason code (see Figure 1).

Figure 1 – Pareto Chart (Cumulative Distribution)
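The cumulative distribution behind a chart like Figure 1 is straightforward to compute.  Here is a minimal sketch in Python (the reason codes and incident log are hypothetical, stand-ins for whatever coding scheme your process uses):

```python
from collections import Counter

def pareto(incidents):
    """Rank reason codes by frequency and compute the cumulative share.

    incidents: a list of reason codes, one per recorded occurrence.
    Returns (reason_code, count, cumulative_fraction) tuples, sorted
    from most to least frequent -- the data behind a Pareto chart.
    """
    counts = Counter(incidents).most_common()
    total = sum(c for _, c in counts)
    result, running = [], 0
    for code, count in counts:
        running += count
        result.append((code, count, running / total))
    return result

# Hypothetical reason codes logged against excess-WIP incidents
log = ["A", "A", "A", "B", "A", "C", "B", "A", "B", "A"]
for code, count, cum in pareto(log):
    print(f"{code}: {count} incidents ({cum:.0%} cumulative)")
```

The same calculation works in a spreadsheet with a COUNTIF, a sort, and a running sum; the point is simply to see which few reason codes account for most of the symptom.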

More commonly, however, the data cannot be readily segmented by root cause.  This is probably because the symptoms and the root causes have not been identified and linked.  Using a simple fishbone diagram (see Figure 2), a few folks who know the business processes involved can probably identify symptoms and trace them to possible root causes.  Naturally, a skilled facilitator (possibly a consultant) will help, but you can also learn by reading up on the idea and by doing it yourself.


Figure 2 – Cause and Effect (Ishikawa or fishbone) diagram with potential root causes marked with capital letter reason codes.

Once the potential root causes have been identified, a system of recording the incidents by reason code has to be put in place.  In some cases, occurrences will not be tied to a reason code or other explanatory data, but some existing data can serve as an approximate surrogate to estimate the order of magnitude of the root cause.  In those cases, you can get to an answer sooner, albeit a less precise one.

As an example, forecasts may come from sales.  You can probably measure their accuracy fairly well by saving the forecast and then comparing it with orders or shipments in the same period the forecast covered (the forecast having been made at lead time).  What is harder to determine is how much better your purchasing, manufacturing and distribution would have been if forecasts were 50% more accurate, or what the bottom-line benefits would have been.  But by making some observations, like how often a job had to be interrupted to start another one because of a canceled order or a wrong forecast, you can begin to build a collection of data that will be the foundation for answering that question.  Then, by creating a cumulative distribution that shows the schedule changes by reason code, you will get an understanding of the size of this problem.  Both inventory turns and customer service will go up if you can create a plan that is more flexible, responsive and accurate by attacking the root cause.
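The accuracy measurement described above, comparing saved forecasts against actuals for the same periods, can be sketched as follows.  This uses a weighted MAPE-style metric; the forecast and shipment figures are invented for illustration:

```python
def forecast_accuracy(forecasts, actuals):
    """Forecast accuracy as 1 minus the weighted absolute percent error.

    forecasts / actuals: per-period quantities for the same item, where
    each forecast was saved at lead time for its period.  1.0 means a
    perfect forecast; lower values mean larger errors.
    """
    abs_error = sum(abs(f - a) for f, a in zip(forecasts, actuals))
    return 1 - abs_error / sum(actuals)

# Hypothetical saved forecasts vs. actual shipments for six periods
fcst = [100, 120, 90, 110, 100, 130]
ship = [90, 130, 85, 100, 120, 110]
print(f"Forecast accuracy: {forecast_accuracy(fcst, ship):.1%}")
```

Whether you weight by volume, value, or item matters less than measuring consistently, so period-over-period changes are comparable.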

Estimate the Benefits

You can make an assumption about how much improvement might be possible.  Then, hypothetically, reduce the schedule changes due to forecast errors by that amount.  Research average WIP and reduce it by the same factor.  Put a procedure in place to track premium shipments paid by your company, by reason code.  Take the premium freight caused by bad forecasts to the bottom line.

Then, since you made an assumption that forecasts could be 50% more accurate, you will need to perform some sensitivity analysis. Vary the 50% and see what the results tell you.  The ratio of the change in the root cause to the effect on the metric you are trying to improve measures your sensitivity to that root cause.  This kind of simulation model can be created with a spreadsheet tool.
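The spreadsheet-style simulation described above can be sketched in a few lines.  Everything here is an assumption for illustration: the baseline cost figures, the share of those costs attributable to forecast error, and the linear relationship between forecast improvement and savings:

```python
def estimated_benefit(improvement, wip_carrying_cost, premium_freight,
                      share_due_to_forecast=0.5):
    """Estimate annual savings if forecast-driven schedule changes fall
    by `improvement` (e.g. 0.5 means 50% fewer).

    Assumes -- a stated simplification, not a guarantee -- that the WIP
    carrying cost and premium freight attributable to bad forecasts
    scale linearly with those schedule changes.
    """
    addressable = (wip_carrying_cost + premium_freight) * share_due_to_forecast
    return addressable * improvement

# Vary the assumed improvement (the sensitivity analysis) using
# hypothetical baseline costs: $400k WIP carrying, $120k premium freight
for imp in (0.25, 0.50, 0.75):
    saved = estimated_benefit(imp, 400_000, 120_000)
    print(f"{imp:.0%} improvement -> ${saved:,.0f} estimated savings")
```

The ratio of the change in the result to the change in the assumed improvement is the sensitivity the paragraph above describes; if small changes in the assumption swing the answer widely, that assumption deserves the most scrutiny.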

Summary

Borrowing the Pareto and Ishikawa tools from TQM practices can help you find data and create information that you did not know was there, but the speed with which this kind of analysis can be performed increases with the availability and accuracy of data.

Thanks for stopping by Supply Chain Action.  Next, I’ll share Part 5 of this article.  Until then, I leave you to also ponder these words from Leo Tolstoy, “There is no greatness where there is no simplicity, goodness and truth.”


About Arnold Mark Wells
Industry, software, and consulting background. I help companies do the things about which I write. If you think it might make sense to explore one of these topics for your organization, I would be delighted to hear from you. I am employed by Opalytics.
