Building a Data Confidence Blueprint


Establishing data confidence requires applying operational risk-management principles to data management. To do this successfully, financial services institutions must go beyond the data itself and focus on three core areas: data governance, data management, and reporting. Careful planning and execution in these areas can make all the difference to an organization’s ability to move from a position of weakness to one of strength with regard to data.

Data governance. Confidence builds faster when there is a framework for organizational accountability, typically a data governance council made up of data stewards and business representatives who understand the data infrastructure supporting their operations. The council is the final arbiter of whether the level of data quality is suitable for its intended use(s), so the business side must be strongly represented. The primary tool of this governance body is a data quality policy that sets standards and delegates authority for approving and managing quality.

Data management. With the data governance policy describing what must be done and who is responsible, the organization has a basic framework to use as it looks at the data management processes that drive how data quality is achieved. As with other operational risk efforts, this involves looking at the data from outside the process. Defining elements in a comprehensive, current data dictionary is critical. Data lineage must be traceable through various flows or supply chains, and any transformation of data as it moves through the flows must be documented and approved. Control points should be established so quality can be confirmed and issues reported. The framework also drives risk indicators and other data quality metrics.
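To make the control-point idea concrete, here is a minimal sketch (not from the original post) of how a control point might validate incoming records against data-dictionary rules and report a simple pass rate. The field names, rules, and sample data are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical data-dictionary entries: each field's expected type and whether it is required.
DATA_DICTIONARY = {
    "account_id": {"type": str, "required": True},
    "balance":    {"type": float, "required": True},
    "as_of_date": {"type": str, "required": True},  # ISO-8601 date expected
}

@dataclass
class ControlPointResult:
    total_records: int
    failed_records: int

    @property
    def pass_rate(self) -> float:
        """Share of records that satisfied every data-dictionary rule."""
        if self.total_records == 0:
            return 1.0
        return 1 - self.failed_records / self.total_records

def run_control_point(records: list[dict]) -> ControlPointResult:
    """Check each record against the data dictionary and count failures."""
    failed = 0
    for record in records:
        for field, rule in DATA_DICTIONARY.items():
            value = record.get(field)
            if value is None:
                if rule["required"]:
                    failed += 1
                    break
            elif not isinstance(value, rule["type"]):
                failed += 1
                break
    return ControlPointResult(total_records=len(records), failed_records=failed)

if __name__ == "__main__":
    sample = [
        {"account_id": "A-100", "balance": 1250.75, "as_of_date": "2015-10-30"},
        {"account_id": "A-101", "balance": "n/a",   "as_of_date": "2015-10-30"},  # wrong type
    ]
    result = run_control_point(sample)
    print(f"Pass rate: {result.pass_rate:.0%}")  # -> Pass rate: 50%
```

A check like this, run at each hand-off in a data flow, is one way the framework can both confirm quality and feed the risk indicators and metrics mentioned above.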

Reporting. The first reporting category is operational. Operational reports typically include little descriptive data beyond an “as of” date and some segmentation information, and they seldom address data quality. The report owner must document how the report is connected to the data management infrastructure and what is done to the data to produce it. This documentation should be sufficient for an independent party to recreate the report and get the same results, and it goes a long way if internal audit or another third party includes report recreation in its testing.

The second set of reports covers data confidence indicators designed to help the organization combat the “stuff happens” phenomenon. As much as organizations try to prevent data quality issues, the data environment in which most organizations operate is increasingly complex. It includes third parties, which may not have the same data quality standards, and human involvement, which introduces variability and error.

Given these factors, it is safe to assume that data quality will always vary and that the variance must be measured. Data quality control reports, risk indicators, and other metrics should be presented with the related operational reports so the information consumers can understand the quality of the data they are dealing with at a given point in time and be confident that the overall quality is being managed.
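As a small illustration of pairing such indicators with an operational report, the sketch below computes hypothetical completeness and timeliness metrics over a report's input and publishes them alongside the figures; the fields and sample records are assumptions, not part of the original post.

```python
from datetime import date

def completeness(records: list[dict], field: str) -> float:
    """Fraction of records where the given field is populated."""
    if not records:
        return 1.0
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records)

def timeliness(records: list[dict], as_of: date) -> float:
    """Fraction of records whose as_of_date matches the reporting date."""
    if not records:
        return 1.0
    on_time = sum(1 for r in records if r.get("as_of_date") == as_of.isoformat())
    return on_time / len(records)

if __name__ == "__main__":
    records = [
        {"account_id": "A-100", "balance": 1250.75, "as_of_date": "2015-10-30"},
        {"account_id": "A-101", "balance": None,    "as_of_date": "2015-10-29"},
    ]
    reporting_date = date(2015, 10, 30)
    # Publish the indicators next to the operational figures so consumers
    # can judge the quality of the data behind the report.
    print(f"balance completeness: {completeness(records, 'balance'):.0%}")
    print(f"timeliness:           {timeliness(records, reporting_date):.0%}")
```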

As data volume, variety, and velocity increase, organizations have no choice but to embrace the data revolution. As they do, it’s important to remember that while quality is job one, it alone will not establish the confidence that comes from “showing your work”—i.e., installing data infrastructure to manage change, sustain quality, and support the organization’s performance goals in the short and long term.

Post Date: 10/30/2015

Michael Goodman