For supply chains to be more responsive, plan generation needs to become faster. Consistent on-time delivery of system loads and operating plans is a key metric for internal and external stakeholders, and impacts their ability to respond to customers.
A $14B high-tech manufacturer was growing market share and had made a number of acquisitions that brought new factories and product lines. The supply chain team was focused on generating production schedules for 60 factories and rolling out a supply chain planning application to the new factories.
On-time delivery of plans was critical for manufacturing, as well as for the team's responsiveness to customers. Unfortunately, plans were delivered on time only 60% of the time, and each hour a plan was late cost millions of dollars. An internal study quantified the cost of these delays at $20M a month.
Initially, the company focused on upgrading the hardware running the optimization engine and updating database processes. That helped somewhat, but it wasn't enough. We then focused on fixing bad data and optimizing the data flowing through the system, developing a solution that addressed the multiple challenges and delivered impressive results.
Automated Data Checks Integrated to Batch Scheduling
Batch runtimes are a function of processing time plus reaction time when failures happen. We integrated data audits into the batch processes — not as part of the ETL steps themselves, but as parallel checks that alerted the support team during the load whenever data exceptions reached a threshold.
The parallel checks and proactive notifications alerted the team to errors hours earlier than they previously would have found out. This made it possible to fix errors before plan delivery deadlines and also improved initial plan quality.
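The threshold-based audit described above can be sketched as follows. This is an illustrative outline only — the validator names, the `ALERT_THRESHOLD` value, and the record fields are assumptions, not the actual rules used in the project.

```python
# Minimal sketch of a data audit that runs alongside a batch load and fires a
# proactive alert when the exception rate crosses a threshold. All names and
# thresholds here are illustrative placeholders, not the vendor's actual API.

ALERT_THRESHOLD = 0.02  # alert if more than 2% of records fail validation


def audit_records(records, validators):
    """Run each validation rule over the records and collect exceptions."""
    exceptions = []
    for i, rec in enumerate(records):
        for rule_name, check in validators.items():
            if not check(rec):
                exceptions.append({"row": i, "rule": rule_name})
    return exceptions


def should_alert(exceptions, total_records):
    """True when the exception rate exceeds the alert threshold."""
    return total_records > 0 and len(exceptions) / total_records > ALERT_THRESHOLD


# Example validators: a required field and a sanity check on quantity
validators = {
    "has_part_number": lambda r: bool(r.get("part_number")),
    "positive_qty": lambda r: r.get("qty", 0) > 0,
}
```

Because the audit reads the same staged data the load is processing, it can run in parallel with the batch rather than waiting for it to finish — which is where the hours of earlier warning come from.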
Data Exceptions & Resolutions Reports
As at most companies, planners spent roughly 20% of their time validating and reconciling data. This was done manually, after the plans were already distributed, which delayed execution and responses to customers. With each batch run, a data quality and plan quality scorecard was published listing exceptions alongside potential resolutions.
The automated exception reports cut down the manual data validation time of the business users and enabled faster planning and execution cycles. Beyond basic data checks, the plan quality reports enabled further insights — for example, run-to-run variance and trends in customer orders.
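A per-run scorecard of this kind might be assembled as below. This is a hypothetical sketch: the rule names, the `RESOLUTIONS` mapping, and the output shape are assumptions made for illustration, not the report format actually deployed.

```python
# Hypothetical sketch of a per-run scorecard that groups data exceptions by
# rule and attaches a suggested resolution, replacing the manual checks
# planners previously did after plans were distributed.

RESOLUTIONS = {
    # Illustrative rule -> resolution pairs; real rules would be site-specific.
    "missing_lead_time": "Default to commodity-group lead time; confirm with buyer.",
    "negative_inventory": "Reconcile against last cycle count before next run.",
}


def build_scorecard(run_id, exceptions):
    """Summarize exceptions by rule, each with count and suggested resolution."""
    by_rule = {}
    for ex in exceptions:
        by_rule.setdefault(ex["rule"], []).append(ex)
    return {
        "run_id": run_id,
        "total_exceptions": len(exceptions),
        "rules": [
            {
                "rule": rule,
                "count": len(items),
                "resolution": RESOLUTIONS.get(rule, "Route to data steward."),
            }
            for rule, items in sorted(by_rule.items())
        ],
    }
```

Publishing a structure like this with every batch run gives planners a single place to see what failed and what to do about it, instead of rediscovering the same issues downstream.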
Plan Quality & Runtimes Analysis
The company's initial approach was to model the full detail of its various factories at full forecast granularity, with the goal of producing the most accurate plan possible. However, planning engines are sensitive to different aspects of data complexity, and more detail doesn't always lead to better plans. Our solution provided analysis of plan runtimes at various levels of model granularity, and of the impact on key planning metrics.
By comparing model complexity, plan runtimes, and plan quality, we were able to balance speed and on-time requirements against plan optimality. The data-driven analysis showed that some granular data collection and processing could be eliminated — for example, by consolidating forecasts and aligning volumes to lot quantities. This led to faster runtimes and freed up team members to focus on more value-added tasks.
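The trade-off analysis described above can be outlined in a few lines: for each candidate granularity, record the solver runtime and a plan-quality score, then pick the fastest configuration whose quality stays within tolerance of the best plan. The granularity levels, runtimes, scores, and tolerance below are made-up placeholders, not measurements from the engagement.

```python
# Illustrative sketch of the granularity trade-off: choose the fastest model
# configuration whose plan quality is within a tolerance of the best observed
# quality. All numbers and level names are invented for illustration.


def pick_granularity(results, quality_tolerance=0.02):
    """results: list of dicts with 'level', 'runtime_min', 'quality' (0..1).

    Returns the fastest run whose quality is within quality_tolerance of the
    best quality seen across all candidate granularities.
    """
    best_quality = max(r["quality"] for r in results)
    acceptable = [
        r for r in results if best_quality - r["quality"] <= quality_tolerance
    ]
    return min(acceptable, key=lambda r: r["runtime_min"])


# Hypothetical benchmark of three model granularities
runs = [
    {"level": "sku-day", "runtime_min": 540, "quality": 0.97},
    {"level": "sku-week", "runtime_min": 300, "quality": 0.96},
    {"level": "family-week", "runtime_min": 120, "quality": 0.90},
]
```

With these placeholder numbers, the sku-week model would be chosen: it runs far faster than full sku-day detail while staying within the quality tolerance, which mirrors the article's point that more detail does not always buy a better plan.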
Business value delivered
With the daily quality checks and analysis in place, on-time delivery and plan quality rapidly improved. Over a three-month period, we reduced daily workflow runtimes by 45%. On-time delivery of plans improved from 60% to 95%, saving the company $19M a month in cost impacts.
Unlock the full potential of your data with DvSum
- Create a unified view of your entire data landscape on Day 1
- Streamline data governance with automatic data classification and enrichment
- Improve data accuracy with integrated data quality and cleansing
- Empower business users to get data insights with no-code self-service data exploration