Predictive Analytics in SCM

Supply Chain 4.0 and Big Data Management are buzzwords widely used in the supply chain community. But what is behind this digital supply chain? And what's in it for someone who manages one?

by Christoph Wiedmann


I asked myself that question, read a lot of articles from colleagues around the world, and first thought: nice to have, but it would require a major transformation of my current data management, which would not be an easy task. Easy tasks are in any case not what a supply chain leader is meant to solve, but such transformations require a corporation-wide strategy that can easily turn into an everlasting project. And that is not what I wanted to end up with. So instead of searching for the all-encompassing enterprise resource planning (ERP) solution, I started to experiment by combining the latest data analytics trends and technologies with problems and requirements from the supply chain world.

In other words, I turned it around. Instead of looking at the problems and challenges and trying to find a perfect solution for them, I looked at solutions provided by the data analytics community and analyzed whether they could be used for the issues I need to solve in my role. I have to admit that I am in the lucky position of having a data analyst on my team, who is at the heart of every solution we implemented and who translated those approaches to my supply chain. In my case, this kind of in-house consultant is the key to success with (Big) Data Management in SCM.

The start was rather simple once you strip away the fancy data management language: forecasting. To a supply chain professional this sounds very familiar, as it is an important element of your negotiation baseline, resource planning, stock strategy and material flow management. So overall nothing new. The issue was: what if you do not yet have the ability to forecast but want to? For years I tried to standardize materials with my engineering department, with moderate success, simply because of a business environment of extremely complex, high-value projects with long schedules. By the time you had agreed on a standard, the product was already due to be replaced by a more efficient solution. That triggered the thought: if it is so hard to force a standard onto the organization, why not analyze the organization for standards instead? Humans have a natural drive to order their environment and themselves by installing standards and processes. How can data analytics help to find such standards in order to forecast them?

The solution is already widely used in the banking business: data management software that analyzes a bank customer's withdrawal habits and builds profiles from them. These are combined with rules and boundaries that guard your account in real time. As soon as a user deviates from these profiles and rules, the account gets blocked. A simple example: you refuel at a gas station in Basel, and five minutes later you buy a PlayStation in San Francisco. The system will react and block your account, because it is physically impossible to travel that fast from Basel to San Francisco; the conclusion is that it must be misuse.
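The Basel/San Francisco rule can be sketched as a simple velocity check. This is only an illustration of the idea, not any bank's actual system; the `Transaction` class, coordinates and the 900 km/h threshold are assumptions for the example.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Transaction:
    lat: float  # latitude in degrees
    lon: float  # longitude in degrees
    t: float    # time in hours since some reference point

def distance_km(a: Transaction, b: Transaction) -> float:
    """Great-circle distance between two transaction locations (haversine formula)."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # 6371 km = mean Earth radius

def is_suspicious(prev: Transaction, curr: Transaction, max_speed_kmh: float = 900.0) -> bool:
    """Flag the pair if the implied travel speed exceeds what is physically plausible
    (roughly airliner speed as an assumed upper bound)."""
    dt = curr.t - prev.t
    if dt <= 0:
        return True
    return distance_km(prev, curr) / dt > max_speed_kmh

# Basel (47.56 N, 7.59 E) and San Francisco (37.77 N, 122.42 W), five minutes apart
basel = Transaction(47.56, 7.59, t=0.0)
sf = Transaction(37.77, -122.42, t=5 / 60)
print(is_suspicious(basel, sf))  # True: the implied speed is far above 900 km/h
```

Real fraud engines layer many such rules on top of learned spending profiles, but the structure is the same: a profile plus a boundary, and a deviation triggers the block.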

To find standards, you simply turn this system upside down: you use predictive analytics tools to identify the standards and to eliminate the exceptions. In a first step, the available engineering data (parts drawn or defined by engineering) was run through predictive analytics tools to identify patterns and systematics in how those definitions were used. The result was further cleaned with statistical methods, and we also checked whether we had predictions or forecasts on parts obsolescence.
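A minimal sketch of this first step: treat parts that engineering reuses frequently as de-facto standards and everything else as exceptions. The part numbers and the 10% share threshold here are hypothetical, and a real analysis would work on far richer engineering data than a flat usage list.

```python
from collections import Counter

# Hypothetical engineering records: the part number used by each design definition.
usage = [
    "BOLT-M6", "BOLT-M6", "BOLT-M6", "BOLT-M6", "BOLT-M6",
    "BOLT-M6", "WASHER-A", "WASHER-A", "WASHER-A", "WASHER-A",
    "BOLT-M6-SPECIAL", "CUSTOM-BRKT-17",
]

def find_standards(records, min_share=0.10):
    """Treat parts that account for at least `min_share` of all usages as
    de-facto standards; everything else is an exception to review or remove."""
    counts = Counter(records)
    total = sum(counts.values())
    standards = {p for p, n in counts.items() if n / total >= min_share}
    exceptions = set(counts) - standards
    return standards, exceptions

standards, exceptions = find_standards(usage)
print(sorted(standards))   # parts the organization has standardized on in practice
print(sorted(exceptions))  # rarely used one-offs
```

The point is the reversal: instead of decreeing a standard, you let the usage data reveal which standards the organization already lives by.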

In parallel, the past consumption data was analyzed and a conventional forecasting system was used to predict simulated consumption for a future project. For the most part, a linear consumption model scaled to the project size was used.
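A linear consumption model scaled to project size amounts to very little code. The quantities and project sizes below are invented for illustration; the article does not disclose real figures.

```python
def per_unit_rate(history):
    """Average per-unit consumption over past projects.
    `history` is a list of (quantity_consumed, project_size) pairs."""
    rates = [qty / size for qty, size in history]
    return sum(rates) / len(rates)

def forecast(history, new_size):
    """Linear consumption model: scale the historical rate to the new project's size."""
    return per_unit_rate(history) * new_size

# Hypothetical past projects: (fasteners consumed, project size)
history = [(1200, 40), (900, 30), (1500, 50)]
print(forecast(history, 55))  # 1650.0
```

Anything more sophisticated (seasonality, learning curves) can replace `per_unit_rate` later; the linear version is often good enough as a first baseline.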

Both approaches, the analysis of past consumption data and the predictive analytics used to find standard usages, were then combined; outliers were analyzed, grouped, and either removed or integrated. Finally, we applied a cluster risk model (a kind of ABC risk analysis) to group parts by the risk that the prediction fails. Et voilà, you have a forecast, in my case for the next project.
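The ABC risk clustering can be sketched as simple threshold binning on an estimated failure probability per part. The part names, risk values and the 10%/30% cut-offs are assumptions; in practice the risk estimate would come from the combined model above.

```python
def abc_risk_clusters(parts):
    """Group parts into A/B/C risk clusters.
    `parts` maps part number -> estimated probability that the forecast is wrong."""
    clusters = {"A": [], "B": [], "C": []}
    for part, risk in sorted(parts.items(), key=lambda kv: kv[1]):
        if risk < 0.10:
            clusters["A"].append(part)   # low risk: safest to buy ahead
        elif risk < 0.30:
            clusters["B"].append(part)   # medium risk
        else:
            clusters["C"].append(part)   # high risk: wait for released data
    return clusters

parts = {"BOLT-M6": 0.05, "WASHER-A": 0.08, "BRACKET-9": 0.22, "CUSTOM-BRKT-17": 0.60}
print(abc_risk_clusters(parts))
```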

How the data is used then depends on a business decision or model. In a perfect world I would have waited until engineering had released the data and bought on demand. In my real world, time is always short, and the business decision hinges on: do I risk a production stop, or how much is being ahead of schedule worth? The answer to this question determines which risk cluster is purchased in advance. Depending on your break-even between the risk of buying the wrong part and the cost of a production stop and/or the risk of delivering the project late, you pick the corresponding risk cluster to purchase, or rather wait until more definitions are provided.
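The break-even itself is an expected-cost comparison, which can be written down in a few lines. All numbers below are made up for illustration; the real inputs would be your part prices, stop costs and the risk estimates from the cluster model.

```python
def buy_ahead(risk_wrong, part_cost, stop_cost, p_stop_if_wait):
    """Decide whether to purchase a risk cluster in advance.

    Expected loss of buying early : risk_wrong * part_cost (the part may be scrap)
    Expected loss of waiting      : p_stop_if_wait * stop_cost (possible production stop)
    Buy ahead when the expected loss of buying early is the smaller of the two.
    """
    return risk_wrong * part_cost < p_stop_if_wait * stop_cost

# Hypothetical: 20% chance the forecast is wrong, part costs 500, a production
# stop would cost 10,000 and has a 5% chance of happening if we wait.
print(buy_ahead(0.20, 500, 10_000, 0.05))  # True: expected 100 vs 500, so buy ahead
```

In this framing, each risk cluster gets its own decision, and the "how much is being ahead of schedule worth" question becomes an explicit number you can argue about.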

Once you have defined the model, it is relatively easy to reapply. I work in a purely project-driven environment, where we build custom products into changing platforms (aircraft) with very long lead times, so a given platform repeats only every two to four years. I would consider such an environment very difficult for systems like this, which means that in more structured and repetitive environments these techniques may be worth even more. You can think about combining this with machine learning algorithms for stock replenishment or automated order management systems. Supply Chain 4.0 then becomes much more real, with the purchaser focusing on problem solving and the management of exceptions instead of administration.
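As a taste of where automated replenishment could start, here is a classic order-up-to rule that such a system might automate (the figures are hypothetical, and a machine-learning variant would replace the static forecast and safety stock with learned values):

```python
def reorder_quantity(on_hand, on_order, forecast_demand, safety_stock):
    """Order-up-to rule: raise the inventory position (on hand + on order)
    to the forecast demand plus a safety stock buffer."""
    target = forecast_demand + safety_stock
    gap = target - (on_hand + on_order)
    return max(0, gap)

# Hypothetical stock position for one part
print(reorder_quantity(on_hand=120, on_order=30, forecast_demand=200, safety_stock=40))  # 90
```

The purchaser then only sees the cases where the rule's assumptions break, which is exactly the exception management described above.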



25. November 2019



About CRAB Analytics

CRAB Analytics designs the right supply chain and procurement strategy for small and medium-sized enterprises, using innovative algorithms and methods to increase the performance of the supply chain.
