Garbage in, garbage out. We all know this as one of the original lessons and persistent problems of our electronic world. It’s a phenomenon that continues to plague the financial services industry in particular as companies struggle to move from a pre-Internet, siloed infrastructure of limited transparency to a post-crisis digital environment of mounting customer expectations. Banks find themselves squeezed between the infrastructure of yesterday, the regulatory environment of today, and the technology of tomorrow.
This squeeze has led to a crisis of confidence that poses major obstacles to future success, the top three barriers being:
The new regulatory reality. Following the financial crisis, the regulatory environment changed dramatically. Regulators now collect detailed data from the largest banks, conduct their own analysis, and draw their own conclusions. If the data provided for that analysis is incomplete or of poor quality, those conclusions may be wrong, which can result in more severe regulatory findings. Regulators also expect financial institutions to do more with their data, and are passing regulations focused on improving data quality and applying more sophisticated analysis to manage risk. For example, the Comprehensive Capital Analysis and Review (CCAR), part of the supervisory regime that followed the Dodd-Frank Act, requires banks to use their data to stress test their portfolios and demonstrate that they hold sufficient capital. The review also weighs data quality in determining whether a bank passes.
The digital world and big data. The Internet revolution has sparked a data revolution, and businesses large and small are struggling to cope with the volume, velocity, and variety of data it has produced. A large enough challenge on its own, it is compounded by the market’s expectation of ever-faster process execution. Customers, investors, and analysts expect immediate satisfaction of their every request. They are also becoming more sophisticated in their understanding of how data can be used. A common complaint in the age of Google and Amazon is, “If they’ve been doing this for years, why can’t you?” Customers expect their banks to “know” them better than ever, yet they also have high expectations about the level of control they retain over their personal data.
The fintech revolution. We often hear about the thousands of fintech startups that are poised to revolutionize banking as we know it. Each of them is eager to take market share from traditional banks in one or more areas. Not surprisingly, banks aren’t eager to give up market share, but they recognize that these disruptors are driving customer expectations and they must adapt to survive. As a result, we are seeing a variety of partnership models emerge. In some cases, banks are acquiring fintech startups. In others, they are forming partnership agreements. We’re also seeing banks developing APIs that will afford them the flexibility to integrate with a variety of partners as the fintech landscape evolves. But all of these models depend on banks having the ability to exchange data with partners in a controlled, secure, efficient manner.
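To make the idea of controlled, secure data exchange concrete, here is a minimal Python sketch of scope-based partner access. The token values, scope names, and account record are invented for illustration; a real deployment would sit behind an API gateway with proper credential management rather than an in-memory dictionary.

```python
# Illustrative sketch only: each partner token grants access to specific
# data scopes, so a partner sees exactly the slice it was granted.
# All tokens, scopes, and account data below are hypothetical.

PARTNER_SCOPES = {
    "token-fintech-a": {"balances"},
    "token-fintech-b": {"balances", "transactions"},
}

ACCOUNT = {
    "balances": {"checking": 2500.00},
    "transactions": [{"amount": -42.17, "merchant": "coffee shop"}],
}

def partner_view(token, scope):
    """Return the requested slice of account data only if the token covers it."""
    if scope in PARTNER_SCOPES.get(token, set()):
        return ACCOUNT[scope]
    return None  # controlled: unknown tokens or out-of-scope requests get nothing

print(partner_view("token-fintech-a", "balances"))      # allowed
print(partner_view("token-fintech-a", "transactions"))  # out of scope, returns None
```

The point of the pattern is that access decisions are data-driven: as the partner landscape evolves, the bank changes grants in one place rather than rewriting integration code.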
A common denominator
The key to overcoming all these challenges is data. It’s the foundation for success in an unpredictable future. Executives, customers, and regulators all recognize that without data confidence, banks will not be able to comply with regulatory requirements, leverage emerging technologies, or compete in the fintech world. For banks to thrive in the digital age, they must achieve a level of data management maturity that far exceeds previously accepted standards.
Pathways to data maturity
NTT DATA is actively working with clients to tackle these daunting data maturity challenges, and with the right roadmap in place it can be done successfully. This roadmap has two major paths: the first leads to operational excellence; the second unlocks the hidden value within the data.
Pathway #1: Getting the operational basics right
As with most efforts, getting the basics right is a critical first step, and the path to data management maturity is no exception. I covered how to prepare your organization for taking on this challenge in an earlier post. I recommended bringing three areas together—data stewards, data management, and data technology—to work toward delivering data confidence by improving data management maturity. These areas must first work together to achieve operational excellence by improving data management capabilities at the conceptual, logical, and physical levels. Guiding work at all levels requires a clear data strategy and data governance model. This sets the tone and direction for the data management efforts that follow and drives the policies and procedures the organization will use to deliver against the strategy.
Core elements of pathway #1 are the data quality and data operations efforts. Coordination with your data stewards across business units is critical for success, as data quality is frequently a function of the business policy and process that drive its generation. The business must define “fit for purpose” and then measure data quality against those definitions. Also critical to success are the architecture and infrastructure efforts to support operational data maturity.
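To make “fit for purpose” concrete, here is a minimal Python sketch in which the business defines quality rules and data is scored against them. The rule names, criteria, and sample records are illustrative assumptions, not any bank’s actual standards.

```python
# Illustrative sketch of business-defined data-quality rules.
# Each rule encodes one "fit for purpose" criterion; pass rates per rule
# give stewards a measurable view of quality. All rules and records
# below are invented for the example.

from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True if the record passes

rules = [
    QualityRule("account_id present", lambda r: bool(r.get("account_id"))),
    QualityRule("balance is numeric", lambda r: isinstance(r.get("balance"), (int, float))),
    QualityRule("currency is ISO code", lambda r: r.get("currency") in {"USD", "EUR", "GBP"}),
]

def score(records):
    """Return the pass rate per rule across all records."""
    return {
        rule.name: sum(rule.check(r) for r in records) / len(records)
        for rule in rules
    }

records = [
    {"account_id": "A-1", "balance": 1250.0, "currency": "USD"},
    {"account_id": "", "balance": "n/a", "currency": "USD"},
]
print(score(records))
# → {'account_id present': 0.5, 'balance is numeric': 0.5, 'currency is ISO code': 1.0}
```

The design choice worth noting is that rules live in data, owned by stewards, rather than being buried in ETL code: when business policy changes, the definitions change with it.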
The pathway #1 journey is considered successful when the elements of modern data management are embedded in the organization. Data quality is managed and metadata is complete, managed, and properly leveraged. The operating model is in place, participants are fulfilling their roles, and the infrastructure is understood and controlled.
In addition to preparing the organization to take the next steps, the pathway #1 journey delivers incremental benefits by satisfying regulatory requirements and reducing the time and expense of 1) responding to regulatory findings and 2) addressing the volume and velocity of data common in today’s banking environment.
Pathway #2: Unleashing the power of your data
Having achieved operational excellence, an organization is ready to take the advanced pathway toward unlocking the hidden value in its data. A key component of this approach is the data lake architectural pattern, in which a wide variety of data is stored in a lake and then curated for consumption. The curation depends on the metadata and data governance foundation established in the first pathway. It also depends on a solid infrastructure to ensure that security, quality, and myriad other problems don’t turn the lake into a swamp.
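One simple way to keep the lake from becoming a swamp is to gate ingestion on catalog metadata. The Python sketch below illustrates the idea with invented field names: datasets arriving without a complete catalog entry are rejected rather than dumped into the lake.

```python
# Illustrative sketch of metadata-gated data-lake registration.
# The required fields, dataset names, and metadata values are
# hypothetical; real lakes would use a proper data catalog service.

REQUIRED_METADATA = {"owner", "schema", "classification", "source_system"}

catalog = {}  # curated-zone catalog: dataset name -> metadata

def register_dataset(name, metadata):
    """Admit a dataset into the curated zone only if its metadata is complete."""
    missing = REQUIRED_METADATA - metadata.keys()
    if missing:
        print(f"rejected {name}: missing {sorted(missing)}")
        return False
    catalog[name] = metadata
    return True

register_dataset("trades_2016", {
    "owner": "capital-markets",
    "schema": "trade_v2",
    "classification": "confidential",
    "source_system": "trading-platform",
})
register_dataset("mystery_dump", {"owner": "unknown"})  # rejected
```

The governance foundation from pathway #1 supplies exactly what this gate needs: agreed metadata standards and stewards accountable for filling them in.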
Part of the data-curation process includes using transcription, recognition, and natural language processing (NLP) to streamline the ingestion and curation of unstructured data. Statistics show that much, sometimes most, of the data in an organization is unstructured, so leveraging these techniques will increase the volume and variety of the data being processed. Done well, they will also increase the information and insights that can be gained from that data, helping banks solve some of the problems that have plagued them for decades. Unlocking the value of the data they’ve been collecting for years will enable banks to “know” customers at a new level. They will be able to look deeper into their own organizations and gain valuable insights into how bank representatives deal with customers and each other. And behind the scenes, they’ll be equipped to reduce the process friction that drives high headcounts, long lead times, and customer dissatisfaction.
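At its simplest, curating unstructured text means tagging it on the way in so it can be queried alongside structured data. The toy Python sketch below uses keyword matching as a stand-in for real NLP; the topics, keywords, and sample message are invented for illustration.

```python
# Toy sketch of tagging unstructured text during ingestion. A production
# pipeline would use NLP libraries (entity extraction, sentiment,
# transcription); simple keyword sets stand in for those models here.

import re

TOPIC_KEYWORDS = {
    "fees": {"fee", "charge", "overdraft"},
    "fraud": {"fraud", "unauthorized", "stolen"},
    "service": {"wait", "branch", "representative"},
}

def tag_message(text):
    """Return the sorted topic tags whose keywords appear in the message."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return sorted(topic for topic, kws in TOPIC_KEYWORDS.items() if words & kws)

msg = "I was charged an overdraft fee after an hour at the branch."
print(tag_message(msg))  # → ['fees', 'service']
```

Once every complaint, call transcript, or email carries tags like these, the unstructured pile becomes searchable and countable, which is what turns volume into insight.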
Looking to the future
It is becoming more common these days to hear bank leaders talk about how their organizations are becoming tech companies. In this digital age, that is an appropriate goal, but without a mastery of the data within their ecosystems, all the technology in the world will prove useless at best and damaging at worst. Without good data to drive advanced analytics, we’ll just have a higher volume, variety, and velocity of garbage going in—and more advanced garbage coming out.
Post Date: 2016-09-05