Imagine a loss of USD 3.1 trillion! That is how much an IBM study, cited in an HBR article, estimates bad data cost the US in 2016 alone. That is almost 17% of US GDP for that year! Now let us put this in perspective and consider how hard companies have to work to improve their top line or reduce their costs by 17%. According to Gartner's April 2018 report, How to Create a Business Case for Data Quality Improvement, "Poor data quality destroys business value. Recent research indicates that organizations believe poor data quality is costing them an average of $15 million/year." If companies make data quality a priority and declare data to be an asset, they can avail themselves of this low-hanging fruit. However, according to Transparency Market Research, the entire worldwide market for Master Data Management (MDM) in 2017 was only USD 3.8 billion, indicating that companies are not investing enough in maintaining data accuracy. This leakage is happening continuously as we speak. By instituting a comprehensive data governance platform and treating data as an asset, companies can avoid becoming victims of poor data quality.
The aforementioned article also discusses what causes poor data quality, points Triniti has made in Lean MDM, Application Data Management, and Master Data Management. Good data starts with good master data. When master data is on TRAC (Timely, Reliable, Accurate, and Complete), the downstream processes that consume it stay clean. However, it is very important to address this problem the right way: not only to stop the hemorrhaging but also to build a culture and an architecture of keeping data clean. If you do not do it the right way, you will not achieve the benefits of your investment in managing data as an asset, and you will have scarred your organization against engaging in such programs in the future.
Most data quality programs focus on data cleansing products that eliminate duplicates. Duplicate data (also referred to as record integrity) is only part of the problem and just one case. As an example, if you have duplicate customers in your CRM, you may be sending the same promotional material by mail or email to your prospects or customers. While the material cost (especially mail) is a waste, the annoyance you cause prospects and customers is a more significant cost. This waste is a simple example of bad record integrity. Attribute integrity is an equally big, if not a more substantial, problem. An example of a lack of attribute integrity would be having an incorrect address for a customer in your ERP: you would end up shipping your product to the wrong address! Imagine the cost of that. Now consider all the other attributes that have such consequences when their values are incorrect. CRM, ERP, SCM, and HCM systems deal with thousands of attributes relating to products, customers, vendors, GL accounts, employees, and more; one can only imagine the cost of poor attribute quality.
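To make record integrity concrete, here is a minimal sketch of how a duplicate check might group customer records by a normalized match key. The field names and matching rule are illustrative assumptions, not how any particular MDM product works; real products use far more sophisticated matching.

```python
def normalize(value: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace for matching."""
    kept = "".join(c for c in value.lower() if c.isalnum() or c.isspace())
    return " ".join(kept.split())

def match_key(record: dict) -> tuple:
    """Build a simple match key from the name and postal code (illustrative)."""
    return (normalize(record["name"]), record.get("postal_code", ""))

def find_duplicates(records: list[dict]) -> list[list[dict]]:
    """Group records sharing a match key; groups of 2+ are likely duplicates."""
    groups: dict[tuple, list[dict]] = {}
    for r in records:
        groups.setdefault(match_key(r), []).append(r)
    return [g for g in groups.values() if len(g) > 1]

customers = [
    {"name": "Acme Corp.", "postal_code": "94105"},
    {"name": "ACME Corp", "postal_code": "94105"},
    {"name": "Globex", "postal_code": "60601"},
]
# The two Acme records end up in a single duplicate group.
duplicate_groups = find_duplicates(customers)
```

Even this toy version shows why the problem is hard: "Acme Corp." and "ACME Corp" only collapse together because of the normalization step, and every additional variation (abbreviations, typos) demands more matching logic.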
If you did not believe in the price paid for poor data and the waste it causes before, then we hope the explanations and examples above shed some light. Now we will highlight an effective strategy to ensure clean data in your organization. While you can find copious documentation on the web, and extensive methodologies exist from thought leaders, analyst organizations, and MDM consulting companies, here are ten practical steps to execute an effective data quality program.
First and foremost, check data at the time it is created. Omitting this validation is the most common mistake MDM programs make. They often cleanse data with data quality tools at the end of the business process to enable analytical reporting. While there are advantages to doing this, the business benefit is mostly compromised: you get none of the benefit of a smooth flow of information during transaction processing, such as in an order-to-cash or procure-to-pay process. The benefit is mostly confined to reporting, and non-real-time reporting at that, since cleansing introduces latency. So to maximize your return, invest in an operational MDM as opposed to an analytical MDM. You can learn more about the distinction in this Gartner blog by Andrew White. Operational MDMs such as Triniti's MDM suite ensure data gets created clean.
Check data accuracy in real time with third-party providers such as Google Maps, D&B, etc., to validate establishment names and addresses. You may be constrained in some cases, as data capture may happen in POS systems or at marketing events. In such cases, the checks can occur as data is integrated into the operational MDM.
Set policies for attributes to ensure standardization and consistency across the entire organization. These policies ensure there is no ambiguity in how the different users who consume the data in downstream processes interpret the values in these attributes and act on them.
Enforce policies systematically where you can; do not rely on users to follow them. As an example, if you want all your SG&A GL accounts to begin with the digit 5 and contain six digits, enforce it so that no user, unaware of the policy, can make a mistake.
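Systematic enforcement of the GL account policy above can be as simple as a validation rule applied before any save. A minimal sketch, using the six-digits-starting-with-5 policy from the example:

```python
import re

# Policy from the example: SG&A GL accounts are exactly six digits
# beginning with 5. Encoding it in a rule means no user can violate it.
SGA_ACCOUNT = re.compile(r"5\d{5}")

def is_valid_sga_account(account: str) -> bool:
    """True only if the account number satisfies the SG&A numbering policy."""
    return SGA_ACCOUNT.fullmatch(account) is not None
```

Wired into the account-creation screen or API, this rejects `412345` (wrong leading digit) and `51234` (too short) at the moment of entry, which is exactly where the error is cheapest to fix.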
Assign owners to policies that require human research and/or expertise.
Enforce security constraints on such policies so that only those owners are allowed to create and update those attributes.
Use structured data as far as possible; do not rely on unstructured data that only humans can interpret. Here is a good example. Say that for most customers the labeling requirements during shipping are "standard", but for a few customers there are exceptions. If you designate a "Labeling" field with a list of values such as "Standard" and "EIAJ" that another software program interprets directly, the chance of error is virtually zero. If instead you create a comment field and have the customer service rep who takes the order type the exception in free form, the operator at shipping has to notice, "interpret", and act on that comment, amplifying the chance of errors.
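In code, replacing the free-form comment with a closed list of values looks like the sketch below. The "Labeling" field and its values come from the example above; treating them as an enumeration means anything outside the allowed list is rejected at capture time.

```python
from enum import Enum

class Labeling(Enum):
    """Closed list of labeling options (values from the example above)."""
    STANDARD = "Standard"
    EIAJ = "EIAJ"

def parse_labeling(value: str) -> Labeling:
    """Accept only a value from the allowed list; reject free-form text."""
    try:
        return Labeling(value)
    except ValueError:
        allowed = [m.value for m in Labeling]
        raise ValueError(f"invalid labeling {value!r}; allowed values: {allowed}")
```

A downstream shipping system can now branch on `Labeling.EIAJ` directly, whereas a comment like "use the special Japanese labels pls" would have required a human to interpret it.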
Create contextual validation for attribute data where possible. This validation should be in addition to the standard referential integrity that CRM, ERP, SCM, and HCM systems already have. As an example, ERP systems let you pick a customer's payment terms from a list of values. However, your organization may link payment terms to a customer profile. Apply this additional validation when you capture the customer master record, and ensure it runs again whenever the profile is updated.
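A contextual rule of this kind goes beyond "is this value in the list?" to "is this value allowed for this customer?". A minimal sketch of the payment-terms example; the profile names and terms below are assumptions for illustration:

```python
# Illustrative policy: the payment terms a rep may pick depend on the
# customer's profile, not just on the ERP's global list of values.
ALLOWED_TERMS = {
    "strategic": {"Net 60", "Net 90"},
    "standard": {"Net 30", "Net 45"},
    "new": {"Prepaid", "Net 15"},
}

def check_payment_terms(profile: str, terms: str) -> None:
    """Raise if the chosen terms are not permitted for this customer profile."""
    allowed = ALLOWED_TERMS.get(profile)
    if allowed is None:
        raise ValueError(f"unknown customer profile {profile!r}")
    if terms not in allowed:
        raise ValueError(
            f"terms {terms!r} not allowed for profile {profile!r}; "
            f"allowed: {sorted(allowed)}"
        )
```

"Net 90" is a perfectly valid value in the ERP's list, yet this check rejects it for a "new" customer, which is exactly the class of error plain referential integrity cannot catch.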
Use a workflow engine as the traffic cop to route the master data entity. Once again, do not rely on people; the error rate is always higher for humans than for computers!
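At its core, such routing is a state machine: each (state, action) pair determines where the record goes next and who must act on it. A minimal sketch, with states, actions, and roles that are illustrative assumptions:

```python
# Routing table: (current state, action) -> (next state, role that acts next).
# A record can never "fall through the cracks" because every state names
# the party responsible for moving it forward.
TRANSITIONS = {
    ("draft", "submit"): ("pending_review", "data_steward"),
    ("pending_review", "approve"): ("active", None),  # None: no further action
    ("pending_review", "reject"): ("draft", "requester"),
}

def route(state: str, action: str) -> tuple:
    """Return (next_state, next_owner), or raise if the action is not allowed."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"action {action!r} is not allowed in state {state!r}")
```

Because illegal transitions raise instead of silently succeeding, nobody can, say, activate a record that was never reviewed; the engine, not human memory, enforces the process.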
Finally, have data quality dashboards that stewards can use to monitor the quality of data and continuously improve the fidelity of data in the enterprise. These dashboards also bring accountability to the owners of data.
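A dashboard of this kind is built from simple, repeatable metrics. One of the most common is completeness: the fraction of records whose required fields are all populated. A minimal sketch, with illustrative field names:

```python
def completeness(records: list[dict], required_fields: list[str]) -> float:
    """Fraction of records in which every required field is populated."""
    if not records:
        return 1.0  # vacuously complete
    complete = sum(
        1 for r in records if all(r.get(f) for f in required_fields)
    )
    return complete / len(records)

customers = [
    {"name": "Acme Corp", "address": "1 Main St"},
    {"name": "Globex", "address": ""},  # missing address
]
score = completeness(customers, ["name", "address"])  # 0.5
```

Tracked per domain and per owner over time, a metric like this is what turns "improve data quality" from a slogan into an accountable, measurable goal.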
Triniti's MDM is built on this common-sense approach. It is far less expensive than archaic MDM platforms. There is value in establishing governance boards, recruiting stewards, and having executive sponsorship, and if your organization elects to have a chief data officer, all the better. However, these initiatives take considerable effort and resolve from top management. Moreover, senior managers are tied up in day-to-day activities, meetings, and firefighting, while IT leaders chase the next big hype such as cloud, big data, AI, and IoT. An elaborate governance model and a perceived-expensive MDM program are slow starters in most organizations; in addition, they have to be budgeted ahead of time. With the Triniti approach, you can start small with a single domain (Customer or GL Account) and implement in a very short time. This delivers the greatest benefit, as it directly impacts transactions in your CRM and ERP. It helps you build the business case for a larger MDM program, deals with both record and attribute integrity, and provides the launchpad for a comprehensive data governance platform.