With corporate environments constantly shifting, companies rely on enterprise resource planning (ERP) systems to manage day-to-day business activities, identify cost-saving opportunities and support vendor relationships. While many companies operate multiple ERPs efficiently, in an ideal world a company has a single, enterprisewide ERP system in place. That ideal gives users full visibility into, and accountability for, the business, helping it maintain its competitive advantage. The key to success is consistency across core business processes.
As companies migrate their systems or merge their master data, the integrity of that information can degrade or become siloed. Data issues can cause slower ERP system performance, poor end-user satisfaction, flawed business decisions and even data corruption. Data integrity issues are also costly. According to the Harvard Business Review, the yearly cost of poor-quality data in the United States alone in 2016 was $3.1 trillion. Additionally, the MIT Sloan Management Review reported that the cost of bad data is an “astonishing 15 to 25% of revenue for most companies.”
The Cost of Poor Data Quality
- 50%: The amount of time that knowledge workers waste in “hidden data factories,” hunting for data, finding and correcting errors, and searching for confirmatory sources for data they don’t trust.
- 60%: The estimated fraction of time that data scientists spend cleaning and organizing data.
- 75%: An estimate of the fraction of total cost associated with “hidden data factories” in simple operations.
Source: Harvard Business Review
An investigation into why Target Canada’s supply chain collapsed when its new ERP was deployed revealed that 70% of its master data was riddled with inaccuracies. Integrating and consolidating multiple ERP systems and platforms into one is already costly, challenging and resource-intensive; the effort should deliver streamlined functionality rather than new data issues, and the project should be a key enabler of strategic decisions and tangible business outcomes. To effectively manage and mitigate ERP system issues and realize the maximum return on an ERP investment, a company must improve, standardize and centralize its master data before upgrading, migrating or merging multiple ERP systems. Businesses should profile and cleanse their data and establish baseline data governance to ensure the ERP system delivers accurate, reliable and consistent results. This proactive approach to data migration minimizes downtime at go-live and reduces both the duration of operational disruption and its associated costs.
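As a sketch of what that baseline profiling might look like in practice, the short Python example below checks completeness, duplicate business keys and out-of-range values in a hypothetical material master extract; the file name, column names and approved unit list are all invented for illustration:

```python
import pandas as pd

# Hypothetical extract of a material master table; file and column names are invented.
materials = pd.read_csv("material_master_extract.csv")

profile = {
    "rows": len(materials),
    # Completeness: percentage of missing values per column.
    "missing_pct": (materials.isna().mean() * 100).round(1).to_dict(),
    # Uniqueness: exact duplicates on the business key.
    "duplicate_keys": int(materials.duplicated(subset=["material_number"]).sum()),
    # Consistency: units of measure outside an assumed approved list.
    "invalid_uom": int((~materials["base_uom"].isin({"EA", "KG", "M", "L"})).sum()),
}
print(profile)
```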
Seven Best Practices for Data Optimization
- Define the data migration strategy: Data migration is required to support ERP installations or conversions, and managing multiple systems requires data normalization. This data work is typically part of a larger project deliverable, but taking proper care of the content is an excellent opportunity to restore confidence in the ERP data. It also ensures a quicker turnaround time for finding essential data and going live on schedule.
- Adopt taxonomies and attributes: A master data taxonomy, also known as a data dictionary, is essential for optimized data. The data dictionary imposes clear rules and patterns on master data, such as the normalization of manufacturers, suppliers and units of measure (nouns, modifiers, attributes and more). Most importantly, the taxonomy builds standard descriptors for the master data through dictionary templates specific to the particular materials or services, providing structured, consistent and complete data (see the template sketch after this list).
- Translate data for your global enterprise: To support global governance, language translation should be part of the migration strategy. The data taxonomy should exist in each target language required across the company so that every end user can understand the data easily (see the translation sketch after this list).
- Standardize data: Once the data taxonomy is defined, source data intended for migration, regardless of structure, type or format, should be validated to ensure that it is accurate, consistent and easy to understand; provides all of the required information; falls within the accepted parameters for the business; can be easily accessed and integrated into the target application; and complies with regulatory standards (see the validation sketch after this list).
- Analyze and deduplicate data: Stakeholders should identify data sources and determine where data cleansing and deduplication are needed. This could include removing old product codes no longer in use or removing duplicate supplier accounts. Executing deduplication during rollout, implementation and migration projects reduces incorrect and inconsistent material and vendor master reporting, reduces the time and cost of manually identifying duplicates later, improves the overall reliability of reports and analysis, and enables Integrated Risk Management 4.0 innovation and growth (see the duplicate-detection sketch after this list).
- Enable governance and control: Predefined processes and workflows should enforce the correct taxonomies and data structures whenever new items are created or detected. This ensures that the target system performs effectively, eliminates workarounds, increases user acceptance and engagement, and gives management confidence in the system (see the new-item gate sketch after this list).
- Leverage data to make better business decisions: With improved data, companies gain the visibility needed to support operational risk management. Industry leaders who understand the importance of a sound digital strategy can reduce the impact of operational risk with a unified, consistent view of critical data, making better decisions that keep their assets safe and productive.
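To make the dictionary-template idea from the taxonomy practice concrete, here is a minimal Python sketch; the `DictionaryTemplate` class, the valve example and its attribute names are invented for illustration rather than drawn from any particular MDM product:

```python
from dataclasses import dataclass

@dataclass
class DictionaryTemplate:
    """One data-dictionary entry: a noun/modifier pair plus its required attributes."""
    noun: str
    modifier: str
    attributes: list[str]  # ordered attribute names the descriptor must contain

    def describe(self, values: dict[str, str]) -> str:
        """Build a standard descriptor: NOUN, MODIFIER: attribute values in order."""
        missing = [a for a in self.attributes if a not in values]
        if missing:
            raise ValueError(f"missing required attributes: {missing}")
        parts = ", ".join(values[a] for a in self.attributes)
        return f"{self.noun.upper()}, {self.modifier.upper()}: {parts}"

valve = DictionaryTemplate("valve", "ball", ["size", "material", "connection"])
print(valve.describe({"size": "2IN", "material": "316SS", "connection": "FLANGED"}))
# -> VALVE, BALL: 2IN, 316SS, FLANGED
```

Because every record of a given type is forced through the same template, descriptions become comparable across plants and source systems, which is what makes later search and deduplication tractable.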
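For the translation practice, a hypothetical sketch of maintaining taxonomy terms per language with an English fallback; the translation table and function are illustrative only:

```python
# Hypothetical approved translations for taxonomy nouns; terms are illustrative.
NOUN_TRANSLATIONS = {
    "valve":   {"en": "VALVE",   "de": "VENTIL", "es": "VÁLVULA"},
    "bearing": {"en": "BEARING", "de": "LAGER",  "es": "RODAMIENTO"},
}

def localized_noun(noun: str, language: str) -> str:
    """Return the approved term for a taxonomy noun in the requested language,
    falling back to English when no translation has been maintained yet."""
    terms = NOUN_TRANSLATIONS.get(noun, {})
    return terms.get(language, terms.get("en", noun.upper()))

print(localized_noun("valve", "de"))  # VENTIL
print(localized_noun("valve", "fr"))  # VALVE (English fallback)
```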
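For the standardization practice, the following sketch validates a material record against invented rules; in a real project the required fields, approved values and format checks would come from the data dictionary itself:

```python
import re

REQUIRED_FIELDS = {"material_number", "description", "base_uom", "manufacturer"}
APPROVED_UOM = {"EA", "KG", "M", "L"}                    # assumed approved units
PART_NUMBER_PATTERN = re.compile(r"^[A-Z0-9\-]{4,20}$")  # invented format rule

def validate(record: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if record.get("base_uom") not in APPROVED_UOM:
        errors.append(f"unapproved unit of measure: {record.get('base_uom')}")
    if not PART_NUMBER_PATTERN.match(record.get("material_number", "")):
        errors.append("material number fails format check")
    return errors

print(validate({"material_number": "AB-1234", "description": "VALVE, BALL",
                "base_uom": "XX", "manufacturer": "ACME"}))
# -> ['unapproved unit of measure: XX']
```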
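For the deduplication practice, a minimal sketch of catching near-duplicate vendor names with the standard library; the normalization steps and the 0.9 similarity threshold are illustrative choices, and production matching would typically rely on more robust fuzzy-matching tooling:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Collapse case, punctuation and common legal suffixes so near-duplicates compare equal."""
    cleaned = "".join(c for c in name.upper() if c.isalnum() or c == " ")
    for suffix in (" INC", " LLC", " LTD", " CORP"):
        cleaned = cleaned.removesuffix(suffix)
    return " ".join(cleaned.split())

def likely_duplicates(vendors: list[str], threshold: float = 0.9) -> list[tuple[str, str]]:
    """Flag vendor-name pairs whose normalized similarity meets the threshold."""
    pairs = []
    for i, a in enumerate(vendors):
        for b in vendors[i + 1:]:
            if SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold:
                pairs.append((a, b))
    return pairs

print(likely_duplicates(["Acme Industrial, Inc.", "ACME INDUSTRIAL", "Apex Tools LLC"]))
# -> [('Acme Industrial, Inc.', 'ACME INDUSTRIAL')]
```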
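And for the governance practice, a hypothetical gate for new-item requests that enforces required fields and blocks likely duplicates before anything reaches the target ERP; the field names and routing messages are invented, and a real workflow would route flagged requests to a data steward queue:

```python
REQUIRED = {"material_number", "description", "base_uom"}

def request_new_material(record: dict, existing_descriptions: set[str]) -> str:
    """Decide whether a new-item request may proceed to the target system."""
    missing = REQUIRED - record.keys()
    if missing:
        return f"REJECTED: missing {sorted(missing)}"
    if record["description"] in existing_descriptions:
        return "ROUTED TO STEWARD: possible duplicate of an existing item"
    existing_descriptions.add(record["description"])
    return "APPROVED: created in staging"

catalog = {"VALVE, BALL: 2IN, 316SS, FLANGED"}
print(request_new_material(
    {"material_number": "AB-1299", "description": "VALVE, BALL: 2IN, 316SS, FLANGED",
     "base_uom": "EA"},
    catalog))
# -> ROUTED TO STEWARD: possible duplicate of an existing item
```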
Clean data is vital to ERP system success. By increasing data integrity with master data management software, business operations can see reduced asset downtime and maintenance costs, decreased inventory and increased return on investment.