Data Quality: The True Measurement of Digital Transformation
By Tobias Fischer | 麻豆原创 News Center | Tue, 02 Feb 2021

Despite the immense attention and investment paid to digital transformation, most businesses still miss the most critical part of their evolution: data. It doesn't matter whether artificial intelligence (AI), process automation, bots, or predictive analytics is adopted. Without high-quality data, these core technologies can never truly benefit any company.

If your company has yet to catch on to the importance of data quality, it's not too late. Though Big Data has been around since the early nineties, the value of and focus on data have increased drastically in the last decade. Since then, some organizations have undergone substantial work realignment, process refinement, and business model innovation to reap the benefits of this information gold mine.

Companies that leverage their data want to create more value for their customers and themselves. But the definition of "value" is broad and depends on a defined set of strategic goals, technology adoption, and data operationalization. In most cases, businesses that cite "value" as the goal of these efforts really mean boosting efficiency, optimizing supply chain flows, or using economies of scale to improve customer relationships.

So, how can organizations use data effectively? As the management adage goes, "You can't manage what you don't measure."

Clearing the Path to Pivotal Change

To measure data use and find areas for improvement, many companies have, over the last eight years, created a new role within their C-level ranks: the chief data officer (CDO). This executive is tasked with defining the company-wide data strategy, including controlling, governing, and managing data-driven engagements to shape the business into an intelligent enterprise. One operational example is the cleanup of inaccurate, incomplete, and duplicated information, also known as "dirty data," residing within the digital infrastructure.

Gartner describes "dirty data" as information that is inaccurate, incomplete, and riddled with duplicates, impacting customer turnover, expense management, sales opportunities, and back-office functions. Therefore, companies should address this issue with the following practices, drawn from Gartner's basic principles of data quality management:

  • Consistency: Data stored in one or multiple locations carries equal values.
  • Accuracy: Data values conform to the target model.
  • Validity: Data values fall within a predefined range or domain.
  • Integrity: Relationships between data values are complete.
  • Relevance: Data holds the right information to support the business.

Improving data quality to the point where any digital transformation gains a beneficial edge is like losing weight: it takes special effort to attain it and consistency to maintain it. With a focused mindset and healthy habits, companies can leverage their data to stay relevant and financially stable, with room for future growth and new business models.

Managing Data with a Principled Approach

While approaches to assessing data quality are numerous, some methods work better than others. For example, many companies use the Friday afternoon measurement method, known as FAM. Using FAM, one assesses the most common errors within the last 100 data records, such as sales orders or business partner information, on a paper spreadsheet to derive an improvement strategy for corrupted data sets.

FAM may not be the best solution for most organizations. We recommend an online tool: a data quality dashboard. This tool provides real-time insight into a company's overall data health, uncovering inefficiencies across different domains to be addressed by the CDO organization in one location, millions of records at a time. A rules framework combined with innovative technologies, such as machine learning and intelligent robotic process automation (iRPA), provides real-time insight to action across all systems in which master data is stored, resolving inconsistencies.
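The rules framework idea can be illustrated with a minimal Python sketch. The `Rule` class, the record layout, and the example rules below are all hypothetical; a real dashboard would evaluate far richer rules per master data domain.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical rules framework: each rule flags records that violate a
# data quality principle, and the "dashboard" aggregates pass rates.
@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # True means the record passes

def score(records: list[dict], rules: list[Rule]) -> dict[str, float]:
    """Return the pass rate per rule across all records."""
    return {
        r.name: sum(r.check(rec) for rec in records) / len(records)
        for r in rules
    }

rules = [
    Rule("validity: country present",
         lambda rec: bool(rec.get("country"))),
    Rule("integrity: city required with street",
         lambda rec: not rec.get("street") or bool(rec.get("city"))),
]

records = [
    {"country": "US", "street": "5th Ave", "city": "NYC"},
    {"country": "", "street": "Main St", "city": ""},
]
print(score(records, rules))  # both rules pass for half the records
```

The per-rule pass rates are exactly what a dashboard would render as KPIs, with drill-down into the failing records.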

1. Data consistency is evaluated in terms of quantity, not quality.

To enhance key performance indicators for data quality, organizations should measure a specific master data domain, such as material master data, against its availability in cloud-based or on-premises systems. This is especially helpful when master data is scattered across different landscapes.

Master data records must also be checked for duplicate values based on the variety of their attributes. Duplicates, followed by missing entries, are the most common defects within corrupted data. To support the duplicate search, further aspects, such as data accuracy, need to be taken into consideration.
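An attribute-based duplicate search can be sketched as follows; the field names and the simple normalization are assumptions for illustration, and production systems typically add fuzzy matching on top.

```python
from collections import defaultdict

def normalize(value: str) -> str:
    """Lowercase and collapse whitespace so cosmetic differences don't hide duplicates."""
    return " ".join(value.lower().split())

def duplicate_candidates(records, key_fields=("name", "city")):
    """Group records whose normalized key attributes match."""
    groups = defaultdict(list)
    for rec in records:
        key = tuple(normalize(rec.get(f, "")) for f in key_fields)
        groups[key].append(rec)
    return [group for group in groups.values() if len(group) > 1]

records = [
    {"id": 1, "name": "ACME Corp", "city": "NYC"},
    {"id": 2, "name": "acme  corp", "city": "nyc"},
    {"id": 3, "name": "Globex", "city": "Springfield"},
]
print(duplicate_candidates(records))  # records 1 and 2 are grouped as candidates
```

Normalizing before comparison is what ties the duplicate search to the accuracy checks described next: the cleaner the values, the fewer duplicates slip through.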

2. Data accuracy is measured by the format and content of defined data sets.

The use of country-specific data formats, such as the structural difference between European and American dates, can significantly impact a company's ability to deliver tangible outcomes. Decision-makers can never assume that data means the same to everyone.
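The date ambiguity is easy to demonstrate: the same string parses to two different dates depending on the assumed convention. The formats below are illustrative.

```python
from datetime import datetime

# "03/04/2021" means 3 April in the European convention
# but March 4 in the American one.
eu = datetime.strptime("03/04/2021", "%d/%m/%Y").date()
us = datetime.strptime("03/04/2021", "%m/%d/%Y").date()
print(eu.isoformat(), us.isoformat())  # 2021-04-03 2021-03-04
```

Storing dates in an unambiguous canonical format such as ISO 8601 removes the guesswork.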

A popular use case for verifying content for accuracy is the handling of workflows and abbreviations. Following logic like "shipment date must not come before the order date" can help ensure the dates on specific activities communicate the same insight to everyone involved in the delivery process.
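The workflow rule quoted above translates directly into a check; the field names are assumptions for the example.

```python
from datetime import date

def shipment_after_order(rec: dict) -> bool:
    """Rule: the shipment date must not come before the order date."""
    return rec["shipment_date"] >= rec["order_date"]

ok = {"order_date": date(2021, 2, 1), "shipment_date": date(2021, 2, 3)}
bad = {"order_date": date(2021, 2, 1), "shipment_date": date(2021, 1, 28)}
print(shipment_after_order(ok), shipment_after_order(bad))  # True False
```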

Further content-related accuracy checks are necessary for homogenous data sets. Abbreviations should be set to a company-wide standard. For instance, in the "City" field, "New York" must appear as "NYC," not "NY." Once applied, such checks are also useful in the duplicate search.
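A company-wide abbreviation standard amounts to a normalization mapping applied before comparison; the mapping below is a made-up example.

```python
# Illustrative company-wide standard for the "City" field.
CITY_STANDARD = {
    "new york": "NYC",
    "new york city": "NYC",
    "ny": "NYC",
    "los angeles": "LA",
}

def standardize_city(raw: str) -> str:
    """Map known variants to the standard abbreviation; leave unknown cities unchanged."""
    return CITY_STANDARD.get(raw.strip().lower(), raw.strip())

print(standardize_city("New York"))  # NYC
print(standardize_city("Berlin"))    # Berlin (no mapping, unchanged)
```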

3. Data validity checks are a recurring task.

As a byproduct of manual processes or reorganizations, companies often hold addresses that are only up to date for a specific time frame. A dedicated business rule can regularly check these time frames against a specified reference date, mark affected records, and alert the responsible data quality owner to derive actions. Such checks need to run on a recurring basis.
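Such a validity rule might look like the following sketch, assuming each address record carries a hypothetical `valid_to` date.

```python
from datetime import date

def expired_records(records: list[dict], reference: date) -> list[dict]:
    """Mark records whose validity window has passed the reference date."""
    return [rec for rec in records if rec["valid_to"] < reference]

records = [
    {"id": 1, "valid_to": date(2022, 12, 31)},
    {"id": 2, "valid_to": date(2020, 6, 30)},
]
print(expired_records(records, reference=date(2021, 2, 2)))  # only record 2 is expired
```

Scheduling this check (for example, daily) and routing the marked records to the data quality owner closes the loop the paragraph describes.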

4. Data integrity is assessed to identify data sets with a recurring pattern.

Key performance indicators for meaningful data quality should consider data integrity patterns from all master data domains. For example, detecting a recurring data pattern could be as simple as making a required field mandatory to capture necessary insights. A more complex scenario may classify a material code against a specific characteristic category or plant location. Patterns within the integrity key performance indicator (KPI) should carry major weight within the overall data quality KPI.
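Both the simple and the complex scenario can be sketched in one integrity check. The field names and the code-to-category pattern ("raw material codes start with RM-") are assumptions for illustration.

```python
MANDATORY = ("material_code", "plant", "category")

def integrity_issues(rec: dict) -> list[str]:
    """Report missing mandatory fields and code/category mismatches."""
    issues = [f"missing {field}" for field in MANDATORY if not rec.get(field)]
    # Illustrative pattern rule: raw materials must carry an RM- code.
    if rec.get("category") == "raw" and not str(rec.get("material_code", "")).startswith("RM-"):
        issues.append("code does not match category")
    return issues

print(integrity_issues({"material_code": "RM-100", "plant": "DE01", "category": "raw"}))  # []
print(integrity_issues({"material_code": "FG-200", "plant": "", "category": "raw"}))
```

Weighting the results of such pattern rules heavily, as the paragraph suggests, reflects that integrity violations usually break downstream processes outright rather than merely degrading them.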

5. Data relevance is achieved through continuous maintenance.

After setting up rules for deriving meaningful data quality, the dashboard setup still requires attention to help ensure users can drill down to the line item level for dynamic, real-time, and actionable insight. Directing analytics results generated by the dashboard to the maintenance layer allows CDOs to easily improve data quality by opening service requests. Furthermore, user feedback from service requests can serve as input to improve rule composition. This feedback should be the starting point of a cycle to resolve and refine the rules and data issues.

Focusing on Data Improvement and Consistency at Scale

Every company needs a single source of truth to pinpoint its weaknesses and inefficiencies. Fortunately, innovations in mobility, artificial intelligence, process automation, bots, and predictive analytics are making data creation and editing more efficient and convenient than ever before.

But to be truly successful, businesses must take a quantum leap in data quality with the assistance of tools such as a CDO dashboard. Why? Because data quality is the heart of every digital transformation and a must-have for every company competing in an increasingly digital marketplace.


Explore how a flexible and scalable rule framework for data quality management can help energize your business's digital transformation.

Stay in the conversation by following 麻豆原创 Services and Support on social media.


Pascal Angerhausen is business transformation lead for CIO Advisory at 麻豆原创.
Tobias Fischer is data architect for CIO Advisory at 麻豆原创.
