Reporting solutions in fragmented ERP environments


Data analytics is often seen as the crazy-science corner of the organization, where you can hardly follow the topics without a PhD in mathematics or programming. But how often do you connect it with a down-to-earth basic functionality like reporting? In a way it really is the crazy-science thing: it delivers results where you did not expect them.

by Christoph Wiedmann


That happened to me most recently when I sat down with our data analyst and described the problem of getting proper reporting in an international organization that runs several different ERP systems. The goal was to avoid yet another Excel macro driving a net of files, where the slightest disruption in the raw data collapses the whole construct, and equally to avoid a push-driven system, where a few key people only need to be on holiday to disable their areas of reporting. The vision was an automated, connected, self-updating platform that is easily accessible and not too specific, so that a maximum audience can use it. Preferably in a dynamic format, meaning that data visualizations, charts and business performance indicators can be changed by the respective user or observer.

Such functionality is part of the standard portfolio of every major ERP provider, and in a perfect world using those options would be the best solution. Unfortunately, that requires either installing uniform interfaces between the different ERP systems, with one leading system that gathers, condenses and displays the needed data, or a change to a single uniform system across the organization. Both usually require considerable funds and expert consultancy, and the second option additionally demands an enterprise-wide ERP strategy. Furthermore, in my case the need was rather short-term: after being appointed to manage SCM for EMEA and Asia, I needed a tool to understand certain performance levels in the regions. Initiating and implementing such changes to the IT infrastructure would have delivered answers a year later at the earliest.

The simple solution was basically to use the tools known from big-data analytics, in our case R. We already had the licenses from earlier exercises in predictive analytics, used to forecast material needs for large aircraft cabin completion projects. The cost of a desktop license is almost negligible compared to the cost of changing the existing ERP landscape. Depending on whether the user wants dynamic reports (we did not), different license packages may be needed and costs may rise, but the investment would still be rather small. Alternatively, Python may deliver even better results.

With that baseline, the decision to start the project was made quickly. The charter was split into three main phases: first, defining what we want to display; second, how to gather the data for the needed information; and third, how to display it.

Defining the business performance indicators to be shown, in other words what should be looked at, was rather simple. Plenty of systems and definitions are already available in the current microeconomic science community, and there was no need to reinvent the wheel. After a short survey of different supply chain measurement tools and models, the final decision fell on the SCOR® model, from which a selection of business indicators was defined. To shortcut the process a bit, it is advisable to distinguish between indicators and performance measurements that are mandatory and those that are merely nice to have. Based on this assessment, a set of required and nice-to-have input data is defined. From there, the local research starts: which system, which location and which department can provide the raw data needed for the calculations? Preferably, the data condensation is done by the analytical tools themselves. That way, data can be retrieved automatically from the defined source without human interaction, which reduces the likelihood of errors and mistakes and is in any case the core mechanic of a big-data tool.
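The mapping from indicators to required and nice-to-have inputs can be sketched in a few lines of code. A minimal illustration in Python (the project itself used R); the indicator names and field names below are made up for the example:

```python
# Hypothetical mapping of indicators to the input data they need.
# A real catalogue would follow the SCOR(R) definitions chosen above.
INDICATORS = {
    "on_time_delivery": {
        "required": {"order_date", "promised_date", "receipt_date"},
        "nice_to_have": {"carrier", "incoterm"},
    },
    "on_quality_rate": {
        "required": {"goods_receipts", "quarantine_bookings"},
        "nice_to_have": {"supplier_id"},
    },
}

def missing_inputs(indicator, available_fields):
    """Return the required fields a site's ERP export cannot yet deliver."""
    return INDICATORS[indicator]["required"] - set(available_fields)

# Example: one site's ERP export exposes only these fields, so the
# local research would flag 'promised_date' as a gap to close.
site_fields = {"order_date", "receipt_date", "goods_receipts"}
print(missing_inputs("on_time_delivery", site_fields))  # {'promised_date'}
```

Running such a check per site and per indicator turns the "which location can provide what" research into a simple, repeatable gap list.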

The next step is programming the algorithm that condenses and calculates the data to the required level. The massive advantage of big-data tools is that they have been specifically designed to be as open as possible to different data formats, in order to achieve as much coverage as possible. This design is the key to solving the initial problem of several non-homogeneous data sources. A quality check is recommended at this stage, on the first raw results of the business indicators: does each indicator actually meet the statement initially required of it? Here it gets a bit tricky, because the results change depending on how the algorithm calculates. For example, if you calculate the "on-quality rate" per site, and one site (A) uses quarantine to store items until incoming inspection is done, whereas the other sites (B, C, D, …) do not follow this practice, a correction of the calculation for site A is required (cases of goods booked to quarantine vs. cases of all goods booked). These definitions and calculation baselines are the most important part and the key to a successful reporting.
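The site-specific correction described above can be made explicit in code. A hedged sketch in Python with assumed field names (real ERP exports differ per system and would first be normalized into such records):

```python
# Hypothetical sketch of the site-specific on-quality-rate correction.
# Sites B, C, D, ... book to quarantine only on a quality finding, so a
# quarantine booking counts as a rejection. Site A routes *all* receipts
# through quarantine pending inspection, so only bookings explicitly
# flagged as rejected may count.
SITES_USING_QUARANTINE_AS_INSPECTION_BUFFER = {"A"}

def on_quality_rate(site, bookings):
    """bookings: list of dicts with 'to_quarantine' and 'rejected' flags."""
    total = len(bookings)
    if site in SITES_USING_QUARANTINE_AS_INSPECTION_BUFFER:
        failed = sum(b["rejected"] for b in bookings)
    else:
        failed = sum(b["to_quarantine"] for b in bookings)
    return 1 - failed / total

# Site A: 10 receipts, all through quarantine, only 1 actually rejected.
site_a = [{"to_quarantine": True, "rejected": False}] * 9 + \
         [{"to_quarantine": True, "rejected": True}]
print(on_quality_rate("A", site_a))  # 0.9, not 0.0
```

Without the correction, site A would report an on-quality rate of zero purely because of its booking practice, which is exactly the kind of distortion the quality check at this stage should catch.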

Last remains the visualization of the results. My policy is mostly "simple is best", but if there is the opportunity to experiment with different forms of visualization, such as plot areas, trend corridors or statistical distribution layers, I recommend widening the horizon. Nevertheless, the first question is again who should use the data, and especially who should read it; the visualization should be oriented towards that audience. For a wider audience I would recommend widely known, simple visualizations such as column, bar or line charts, a rather general approach. If the data is for a small group of specialists, more sophisticated types can be used. And not only the form of presentation, but also the colors, the size and placement relative to the viewer, and the font types and sizes can be used to standardize, simplify or highlight.

The second question is on which platform the data shall be published. I wanted a platform that is cloud-based, and therefore independent of anyone's computer or server, and available at all times to my management team to help them run their businesses. The common tool in our environment was Microsoft SharePoint, which we used in the end. Publishing was as easy as web publishing, and the access-rights management is very convenient. The data was uploaded automatically to the defined pages on a monthly recurrence, which was enough for my use. Higher frequencies or even dynamic solutions could certainly have been built, but would not have added value for the intended use. Instead, I put some extra focus on a clean and modern design of the SharePoint sites, as I work better in a convenient and pleasing environment.
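The automated monthly publishing step can be illustrated with a small sketch: rendering the indicator results as a plain HTML fragment that a scheduled job could push to a SharePoint page. This is a minimal illustration in Python; the helper names and the sample values are invented, and a real setup would feed the fragment through SharePoint's own publishing mechanism:

```python
from html import escape

def _cells(tag, values):
    # Render one row of escaped table cells with the given tag (th/td).
    return "".join(f"<{tag}>{escape(str(v))}</{tag}>" for v in values)

def render_table(headers, rows):
    """Render a simple HTML table ready for web publishing."""
    body = "".join(f"<tr>{_cells('td', row)}</tr>" for row in rows)
    return f"<table><tr>{_cells('th', headers)}</tr>{body}</table>"

# Invented sample values for the monthly upload.
monthly = [("Site A", "94.2 %"), ("Site B", "97.8 %")]
html = render_table(["Site", "On-quality rate"], monthly)
```

Keeping the output as plain HTML is what makes the publishing as easy as web publishing: the fragment drops into the page, and SharePoint's access-rights management handles who gets to see it.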

Bingo: with the last step closed, I had an SCM business indicator suite for the whole region, across 4 ERP platforms and 7 sites around the world in 4 time zones, embedded in Microsoft SharePoint and available to the whole enterprise, with the opportunity for me to define access rights for exactly the group of people I wanted to see the data.


26 November 2019



About CRAB Analytics

CRAB Analytics designs the right supply chain and procurement strategy for small and medium-sized companies, using innovative algorithms and methods to increase the performance of the supply chain.
