Ensure Project Success

Each year, companies spend millions of dollars on new system projects, consolidations, and migrations. Research shows that the majority of these projects run over budget and fail to deliver the anticipated benefits. The primary cause: bad data.

Company

A mid-sized consumer goods manufacturer had embarked on a multi-year, $100 million program to implement ERP and supply chain management systems. The goal of the project was to gain operational efficiency and standardize supply chain processes across business units and geographies. The large project team included representatives from the various business groups, a Tier-1 systems integrator, and consultants from the software vendors.


Challenge

Across all of the project efforts, the common dependency was data. High-quality data was critical for the planning applications to generate feasible plans, and data was also vital for testing the flow of information and the interfaces between systems.

Unfortunately, the initial project focus had been on implementing the software and building the interfaces. Pre-scrubbed data sets were used, and only visual spot checks were done to validate the initial phases. When it came time to test the system with live bulk data, poor data quality resulted in poor forecast quality, infeasible production schedules, and distribution plans that couldn't be executed.

The project was put on hold and the team sat idle. A mitigation plan was put together that outlined a delay to the go-live and an additional $3M in project implementation fees.

Solution

With the focus shifting to data, we outlined a solution to address the various aspects of the data challenge. A key issue for the extended project teams was determining what was an interface issue and what was a data issue. With large volumes of data and errors affecting every aspect of the business, scalability was also a concern: how could the teams validate all of that data manually?

✔ Automated data testing integrated with job scheduling

Bulk data loads had long runtimes and contained numerous data errors and interface failures, which made it impossible to validate the data manually or to determine which failures were interface issues. Our solution integrated data quality checks into the job scheduler to capture both load errors and data quality exceptions.

Automated data validation enabled bulk data to be tested very quickly. Stopping bad batch jobs and fixing the data before continuing saved time. The testing framework also isolated data quality issues from interface issues and distributed them to the appropriate teams, saving additional time previously spent sorting out dependencies and responsibilities.
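
To illustrate the pattern (not the production implementation), the sketch below shows a minimal data-quality gate a job scheduler can run before a load step: it validates a batch file against simple rules and signals via its exit code whether the job chain should continue. The column names, rules, and error-rate threshold here are hypothetical.

```python
"""Minimal sketch of a data-quality gate wired into a batch load job.
All column names and thresholds are hypothetical examples."""
import csv
import sys

# Hypothetical rules: column -> validator returning True when the value is OK.
RULES = {
    "material_id": lambda v: bool(v.strip()),          # must be populated
    "quantity": lambda v: v.replace(".", "", 1).isdigit() and float(v) >= 0,
    "plant": lambda v: len(v.strip()) == 4,            # assume 4-char plant codes
}

def validate_batch(path, max_error_rate=0.01):
    """Return (ok, exceptions). The scheduler halts the load when ok is False."""
    exceptions, total = [], 0
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f), start=1):
            total += 1
            for col, rule in RULES.items():
                if not rule(row.get(col, "")):
                    exceptions.append((i, col, row.get(col, "")))
    ok = total > 0 and len(exceptions) / total <= max_error_rate
    return ok, exceptions

if __name__ == "__main__":
    ok, exceptions = validate_batch(sys.argv[1])
    for line_no, col, value in exceptions[:20]:  # sample for the exception report
        print(f"row {line_no}: bad {col!r} value {value!r}")
    # A non-zero exit code is the conventional signal for a scheduler
    # to stop the downstream job chain before bad data propagates.
    sys.exit(0 if ok else 1)
```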

✔ Automated regression interface testing

While data issues were being isolated, regression testing of the interfaces was also completed. A reconciliation validation approach was chosen for transactional data interfaces, such as sales orders and stock transfer orders, where every record must arrive intact. An aggregate summary validation approach was chosen for interfaces where the granularity of the data differs between source and target.

Automating the testing saved time, provided clarity around what was an interface issue and what was a software issue, and enabled the system integrator and software teams to focus on the issues they were responsible for and resolve them quickly. A sketch of the two validation approaches follows.
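
The sketch below is an illustrative example of the two approaches rather than the actual test harness; the key, group, and measure column names and the tolerance are assumptions.

```python
"""Sketch of the two interface-validation approaches described above."""

def reconcile(source_rows, target_rows, key="order_id"):
    """Row-level reconciliation for transactional interfaces (e.g. sales orders):
    every source record must arrive in the target with matching fields."""
    target_by_key = {r[key]: r for r in target_rows}
    missing = [r for r in source_rows if r[key] not in target_by_key]
    mismatched = [
        (r, target_by_key[r[key]])
        for r in source_rows
        if r[key] in target_by_key and r != target_by_key[r[key]]
    ]
    return missing, mismatched

def aggregate_check(source_rows, target_rows, group="plant", measure="qty", tol=1e-6):
    """Aggregate summary validation for interfaces whose granularity differs:
    compare totals per group instead of matching row by row."""
    def totals(rows):
        out = {}
        for r in rows:
            out[r[group]] = out.get(r[group], 0.0) + float(r[measure])
        return out
    s, t = totals(source_rows), totals(target_rows)
    # Return the groups whose source and target totals disagree.
    return {g for g in s.keys() | t.keys() if abs(s.get(g, 0) - t.get(g, 0)) > tol}
```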

✔ Master Data Exception Reports

Master Data was common across all of the processes; however, the legacy system's data had not been validated or cleansed for some time. Specific sets of reports and exceptions for Master Data were generated automatically and distributed to the responsible teams for Products, Customers, Materials, and other domains.

Data teams no longer had to wait for interface issues to be resolved before working Master Data exceptions. The reports enabled them to quickly prioritize and fix Master Data issues and then validate that the data was accurate and flowing correctly across systems.
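
A minimal sketch of how such exception reports can be generated and routed is shown below; the domains, check rules, and team names are hypothetical placeholders for the rules the business teams actually defined.

```python
"""Sketch of automated master-data exception reporting, routed by owning team."""
from collections import defaultdict

# domain -> list of (check name, predicate over a record dict, owning team)
CHECKS = {
    "Product":  [("missing UoM",     lambda r: not r.get("uom"),           "product-team"),
                 ("no gross weight", lambda r: not r.get("gross_wt"),      "product-team")],
    "Customer": [("missing ship-to", lambda r: not r.get("ship_to"),       "customer-team")],
    "Material": [("zero lead time",  lambda r: r.get("lead_time", 0) <= 0, "material-team")],
}

def exception_report(domain, records):
    """Group exceptions by owning team so each team gets only its own queue."""
    report = defaultdict(list)
    for rec in records:
        for name, failed, team in CHECKS[domain]:
            if failed(rec):
                report[team].append((rec.get("id"), name))
    return dict(report)

# Example: two material records, one with an invalid lead time.
print(exception_report("Material", [
    {"id": "MAT-001", "lead_time": 14},
    {"id": "MAT-002", "lead_time": 0},
]))
```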

✔ Supply Chain Plan Quality Reports

Even with the interfaces and Master Data correct, the various plans and forecasts could still be infeasible. Specific functional validations were put in place to measure and improve data quality against the requirements of the supply chain systems and processes.

In parallel with the Master Data and interface validations, the supply chain teams were able to confirm that their plans and reports were accurate, feasible, and optimal against their goals. Supply chain data quality quickly improved and was given sign-off.
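
As a simplified illustration of one functional plan-quality check, the sketch below flags production plan lines scheduled beyond a line's capacity; the capacity figures and plan structure are invented for the example, and real validations would encode each planning process's own feasibility rules.

```python
"""Sketch of a functional plan-quality check: flag plan lines over capacity."""

# Hypothetical weekly capacity (units) per production line.
CAPACITY = {"LINE-A": 10_000, "LINE-B": 6_000}

def infeasible_lines(plan):
    """Return plan entries whose scheduled quantity exceeds line capacity."""
    return [p for p in plan if p["qty"] > CAPACITY.get(p["line"], 0)]

plan = [
    {"week": "2024-W01", "line": "LINE-A", "qty": 9_500},   # feasible
    {"week": "2024-W01", "line": "LINE-B", "qty": 7_200},   # over capacity
]
for p in infeasible_lines(plan):
    print(f"{p['week']} {p['line']}: planned {p['qty']} > capacity {CAPACITY[p['line']]}")
```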

Business value delivered

Our solution isolated and coordinated data improvements on multiple fronts in parallel. Productivity around the key issues quickly improved and timelines accelerated. Data quality improved from 70% to 99% for the critical supply chain processes, and end-user testing passed with flying colors. The company avoided the planned $3M project extension and was able to manage and maintain high data quality with the new automated solution.


Unlock the full potential of your data with DvSum

  • Create a unified view of your entire data landscape on Day 1
  • Streamline data governance with automatic data classification and enrichment
  • Improve data accuracy with integrated data quality and cleansing
  • Empower business users to get data insights with no-code self-service data exploration