40 Trillion Gigabytes Force Financial Institutions to Rethink Data Management Programs

Data is increasingly becoming a force that drives financial institutions to change the way they think: about data itself, about the way they manage their businesses, and about the future. The sheer volume and availability of both structured and unstructured data has created an entire digital universe. The term coined for this explosive growth is ‘big data,’ and big data seems to have taken on a life of its own.

The digital universe is expected to double every two years between now and 2020. According to IDC, by that time it will contain 40,000 exabytes (or 40 trillion gigabytes). That’s more than 5,200 gigabytes for every man, woman, and child. IDC projects that by 2020 the digital universe will have grown 50-fold over the course of a single decade.
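As a rough, back-of-the-envelope check of that per-capita figure, consider the sketch below; the projected 2020 world population of roughly 7.6 billion is our own illustrative assumption, not a number taken from the IDC report.

```python
# Back-of-the-envelope check of the "more than 5,200 GB per person" figure.
# Assumption: a projected 2020 world population of ~7.6 billion (illustrative only).

GB_PER_EXABYTE = 1_000_000_000                   # 1 exabyte = 10^9 gigabytes
digital_universe_gb = 40_000 * GB_PER_EXABYTE    # 40,000 exabytes = 40 trillion GB
population_2020 = 7.6e9                          # assumed world population in 2020

gb_per_person = digital_universe_gb / population_2020
print(f"{gb_per_person:,.0f} GB per person")     # prints roughly 5,263 GB per person
```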

Not only has data become massive in scope, but innovation is also creating new types of data that will impact business, money flow, security, and almost every other aspect of our economy in ways we can’t yet comprehend.

Rough estimates from Twitter and Facebook indicate that every 60 seconds, over 340,000 tweets are sent and over 3,300,000 pieces of content are shared on Facebook. Tumblr states that, as of this writing, it hosts over 174,300,000 blogs. This activity is occurring not only on the consumer-driven side, but on the business-driven side as well.

Information flow from unstructured data is playing a key role in the financial industry as well. The exponential growth in both data volume and velocity is forcing financial institutions to make radical changes in the way they deal with data. They are facing change almost everywhere they look, from the technologies they use to the processes and structures through which data is managed. Data vendors are rising to meet the demand by creating innovative new products such as sentiment indexes, news analytics, and even outsourced quantitative analysis via factor feeds. These new products create yet another set of data that will have to be included in the scope of every enterprise data management program.

From a structured perspective, enterprise data management programs focus on reference data, market data, and operational data, with little attention to or coverage of analytical data such as quantitative factors, supply chain information, and sentiment indexes. Typically, this data is decentralized and left to the investment, analytics, trading, and risk teams. The growth here is primarily driven by the digitization of the financial industry. Key data vendors are also significant drivers as more data is added and additional data types are provided. Things are changing so rapidly that traditional approaches are no longer adequate for processing or managing data, much less analyzing it. The volume of information will only continue to grow, and the time available to process it will only continue to shrink.

This exponential growth seems unwieldy and impossible to manage. At the same time, the potential for utilizing this data and transforming it into an engine for growth and optimization is only beginning to be considered.

A Financial Times report from the most recent World Economic Forum in Davos highlighted an entirely new world of challenges stemming from the growth of data and the economic implications of that growth. The ability to process data in a way that yields usable knowledge has been a key topic among executives of leading financial institutions.

The report quoted the chairman of a large universal bank in reference to the financial industry: “This is the most digitizable industry you can imagine so I am surprised that the banks haven’t recognized that the real challenges come from technology and not from other banks.”

With increases in financial regulation, from the Dodd-Frank Act to Basel III, on every institution’s plate, it is hardly surprising that bankers have had little time to grapple with the next big challenge: how to harness the digital revolution that is about to overhaul the financial industry.

Yet they can no longer afford to put off addressing these issues. The growth is happening. It is already challenging existing analytical platforms because firms’ centralized data management programs do not cover these new data types and their uses.

With the growth of data already seemingly overwhelming, how can firms begin to address these challenges when they are still just trying to catch up? Where do they even begin? And how, from this chaos, can they develop an organized, pragmatic approach for capitalizing on what is already proving to be a critical engine for enterprise growth?

The first step is to perform an assessment of their current situation. A firm’s current state of data management should be analyzed from multiple perspectives, from corporate culture and the existing governance model, to sourcing and data quality assurance, to technology architectures. Ideally, a formal framework such as the EDM Council’s Data Management Maturity (DMM) model should be used to assess data management maturity across 11 different perspectives. Institutions that already have a certain degree of maturity in data management should assess each dimension for each data type not yet covered by their program.

The information garnered from this assessment will define the data management program that will best serve that institution in the future. From this target program, the firm will begin to understand what will be required for future enterprise data management operations. Once these two steps have been accomplished, a plan of action can be put in place to cope with this massive new challenge, even as firms dedicate the majority of their resources over the coming years to complying with changes in regulatory requirements.

In fact, the firms that will flourish in this new data-driven environment will be those that take action now; those that delay will lag behind.

The industry is developing a set of standards for benchmarking data quality and for utilizing data to improve operational, tactical, and strategic decision making. These standards will continue to evolve as technology and data quality improve. But the trajectory toward ideal data management is a long one.

In the meantime, financial organizations can become proactive in the process by gaining transparency into their own developmental state. Understanding how their processes stack up against industry best practices will enable them to identify and subsequently execute initiatives to improve data management maturity.

In 2011, Element22 took steps to expand industry capabilities by participating in the EDM Council’s effort to develop a standard industry framework for data management maturity. The EDM Council’s Data Management Maturity (DMM) model provides a benchmark that institutions can use to assess their data management process maturity and then identify appropriate steps for improvement.

The assessment process based on the DMM model facilitates building a foundation. But as with everything else, institutions must view the establishment of such a foundation as only the first step in an ongoing process rather than as an ‘end-all’ solution. Setting up a program for consistent evaluation of data management maturity will equip firms not only to deal with massive data growth but also to optimize that data for the benefit of their enterprises.

The ongoing digitization of new data and the increasing number of new data types are already causing a major transformation of the financial industry. Firms that step up to the plate and advance their place on the data management developmental trajectory will finish first.

A market commentary provided by

Thomas Bodenski, Partner, Element22
Predrag Dizdarevic, Partner, Element22

The opinions expressed are as of March 2014 and may change as subsequent conditions vary.