Standstill in Data Management? Metrics, Adherence, Meaning, Lineage and Quality offset solid progress on Data Governance

DCAM Data Management Benchmark improves by only 0.09 since 2015, reaching 3.22 in 2017

The EDM Council recently conducted the data management benchmarking study for the second time since its inception in 2015. The 2017 study continues to assess where the global financial information industry collectively stands against the requirements for sustainable data management as defined by the Data Management Capability Assessment Model (DCAM).

The results demonstrate that while we continue to operationalize data governance and policies, not much progress has been made in making data more trustworthy and accessible. Financial institutions are still mired in tactical tasks such as mapping data from physical repositories to applications or reconciling business glossaries to define the business meanings of data.

The benchmark indicates that progress in establishing mature data management is at a standstill. The average score across all 8 components of the DCAM is 3.22, a minor improvement of 0.09 over 2015, when the benchmark stood at 3.13.

Out of the 22 statements, 15 received higher scores than in 2015, 4 decreased in score, 2 remained unchanged, and 1 statement was added since the 2015 survey:

 #  2017  2015  Statement
 1   3.5   3.5  Our organization has a defined and endorsed data management strategy
 2   3.4   3.3  The goals, objectives and authorities of the data management program are well communicated
 3   3.5   3.4  The data management program is established and has the authority to enforce adherence
 4   3.5   3.5  Stakeholders understand (and buy into) the need for the data management program
 5   3.6   3.4  The funding model for the Data Management Program is established and sanctioned
 6   2.7   2.8  The costs of (and benefits associated with) the Data Management Program are being measured
 7   3.2   3.3  The data management program is sufficiently resourced
 8   3.3   3.0  Data management operates collaboratively with existing enterprise control functions
 9   3.6   3.4  Data governance structure and authority is implemented and communicated
10   3.5   3.2  Governance “owners” and “stewards” are in place with clearly defined roles and responsibilities
11   3.6   3.3  Data policies and standards are documented, implemented and enforced
12   3.0   2.7  The “end user” community is adhering to the data governance policy and standards
13   3.0   2.9  The business meaning of data is defined, harmonized across repositories and governed
14   3.3   3.2  Critical data elements are identified and managed
15   3.2   3.3  Logical data domains have been declared, prioritized and sanctioned
16   2.8   2.7  End-to-end data lineage has been defined across the entire data lifecycle
17   3.2   n/a  Technical architecture is defined and integrated (new statement in 2017)
18   2.6   2.7  All data under the authority of the Data Management Program is profiled, analyzed and graded
19   3.1   3.0  Procedures for managing data quality are defined, implemented and measured
20   3.1   3.0  Root cause analysis is performed and corrective measures are being implemented
21   3.2   3.1  Technology standards and governance are in place to support data management objectives
22   3.2   3.1  The data management program is aligned with internal technical and operational capabilities
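
For readers who want to reproduce the headline tallies, a minimal Python sketch (with the scores hand-copied from the table above) confirms the 15 / 4 / 2 / 1 breakdown of improved, declined, unchanged and newly added statements:

```python
# Scores copied from the 2017 and 2015 columns above; statement 17 has no 2015 score.
scores_2017 = [3.5, 3.4, 3.5, 3.5, 3.6, 2.7, 3.2, 3.3, 3.6, 3.5, 3.6,
               3.0, 3.0, 3.3, 3.2, 2.8, 3.2, 2.6, 3.1, 3.1, 3.2, 3.2]
scores_2015 = [3.5, 3.3, 3.4, 3.5, 3.4, 2.8, 3.3, 3.0, 3.4, 3.2, 3.3,
               2.7, 2.9, 3.2, 3.3, 2.7, None, 2.7, 3.0, 3.0, 3.1, 3.1]

pairs = list(zip(scores_2017, scores_2015))
higher    = sum(1 for a, b in pairs if b is not None and a > b)
lower     = sum(1 for a, b in pairs if b is not None and a < b)
unchanged = sum(1 for a, b in pairs if b is not None and a == b)
added     = sum(1 for _, b in pairs if b is None)

print(higher, lower, unchanged, added)  # 15 4 2 1
```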


4th Report from BCBS on the adoption of the principles for effective risk data aggregation confirms standstill

BCBS published its 4th report on BCBS 239 compliance, “Progress in adopting the Principles for effective risk data aggregation and risk reporting”, in March 2017.

The 11 principles were scored between 2.60 and 3.37 and only 3 principles (27%) received a score higher than 3.

Principle   P1    P2    P3    P4    P5    P6    P7    P8    P9    P10   P11
Score       2.93  2.60  2.73  2.97  2.73  2.83  2.70  3.03  3.07  2.90  3.37
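
The average and the count of principles scoring above 3 follow directly from these numbers; a small sketch (again with the published scores hand-copied) reproduces both:

```python
# 2017 principle scores from the BCBS progress report, as listed above.
principle_scores = {
    "P1": 2.93, "P2": 2.60, "P3": 2.73, "P4": 2.97, "P5": 2.73, "P6": 2.83,
    "P7": 2.70, "P8": 3.03, "P9": 3.07, "P10": 2.90, "P11": 3.37,
}

average = sum(principle_scores.values()) / len(principle_scores)
above_three = [p for p, score in principle_scores.items() if score > 3]

print(round(average, 2))  # 2.9  (reported as 2.90)
print(above_three)        # ['P8', 'P9', 'P11'] -> 3 of 11 principles, roughly 27%
```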


Similar to the DCAM Data Management Benchmark results, the improvement seen in the BCBS progress report since 2015 is only marginal: a 0.05 increase in the average score, to 2.90, with 3 principles receiving lower scores than in the 2016 assessment. Based solely on these scores, only 21% of the 30 assessed institutions are in full compliance.


Five major problem areas have been identified by the 2017 DCAM Data Management Benchmark Study:

Problem area  Score  Statement
Metrics       2.7    The costs of (and benefits associated with) the data management program are being measured
Adherence     3.0    The end user community is adhering to the data governance policy and standards
Meaning       3.0    The business meaning of data is defined, harmonized across repositories and governed
Lineage       2.8    End-to-end data lineage has been defined across the entire data lifecycle
Profiling     2.6    All data under the authority of the Data Management Program is profiled, analyzed and graded


Metrics, Adherence, Meaning, Lineage and Profiling (which the industry typically also refers to as data quality assessment) received the lowest scores out of the 22 statements.

The lack of measurement of data management operations, adherence, and data quality flags a key risk in the execution of a firm’s data management strategy.

This is the primary reason chief data officers fail to effectively communicate the value that their data office contributes to an organization.

Without a harmonized definition of business meanings, the industry will never be able to unravel interconnections, automate processes, or manage linked risks. Work is underway, but glossary reconciliation has proven to be a significantly more difficult and time-consuming task than anticipated.

With thousands of applications and hundreds of data models, this challenge gets even more complicated. Terms must be mapped from physical repositories and linked to logical data models, ultimately resulting in transparent lineage and data flows. But, as outlined in the BCBS 239 progress report and the 2017 DCAM Data Management Benchmark, financial institutions still have a long way to go in defining end-to-end data lineage.

The consequence is that data quality suffers: we don’t understand the rules, lack common definitions, and cannot map the business glossary to the physical repositories where the assessment of data quality should actually be performed.
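
Schematically, and purely as an illustration (the repository, column and rule names below are invented rather than taken from any firm or standard), tying a harmonized business meaning to the physical columns where data quality is actually assessed might look like this:

```python
# Illustrative sketch only: all repository, column and rule names are hypothetical.
# The idea: attach a quality rule to a glossary term and evaluate it in the
# physical repositories the term maps to, once that mapping (lineage) exists.
from dataclasses import dataclass
from typing import Callable

@dataclass
class GlossaryTerm:
    name: str                        # harmonized business meaning
    rule: Callable[[object], bool]   # data quality rule derived from that meaning
    physical_columns: list[str]      # lineage: where the term materializes

glossary = [
    GlossaryTerm(
        name="Legal Entity Identifier",
        rule=lambda v: isinstance(v, str) and len(v) == 20,
        physical_columns=["crm_db.clients.lei", "risk_dwh.counterparty.lei_code"],
    ),
]

def profile(term: GlossaryTerm, values_by_column: dict[str, list]) -> dict[str, float]:
    """Grade each mapped physical column: share of values passing the term's rule."""
    return {
        col: sum(term.rule(v) for v in values_by_column[col]) / len(values_by_column[col])
        for col in term.physical_columns
    }

# Made-up sample values pulled from the two mapped columns.
sample = {
    "crm_db.clients.lei": ["5493001KJTIIGC8Y1R12", "INVALID"],
    "risk_dwh.counterparty.lei_code": ["5493001KJTIIGC8Y1R12"],
}
print(profile(glossary[0], sample))
# {'crm_db.clients.lei': 0.5, 'risk_dwh.counterparty.lei_code': 1.0}
```

Without the harmonized definition (the rule) and the term-to-column mapping (the lineage), neither the profiling nor the grading called for in statement 18 can be performed at scale.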

These five problem areas are the main reason why the implementation of robust data management seems to be at a standstill, and years away from completion.


Data management programs must be driven by or integrated into business initiatives

The foundation of data governance has already been designed to serve the enterprise, but now we need to tackle the open challenges. We need to align ourselves more clearly with business objectives and focus on the most important areas at each firm.

Therefore, we suggest that data management programs need to be driven by business initiatives and fully integrated into them. This will help them scale to the level required to bring the problematic capabilities to maturity, while generating added value for the most critical business areas.

Most importantly, it will fast-track our progress towards trustworthy and accessible data for the business areas that matter most.

What’s the better approach to data management maturity: boiling the ocean, or focusing on a specific set of initiatives?

We are certain that firms with better project management capabilities and an initiative-driven approach will lead the scoreboards of BCBS 239 compliance and data management maturity by a large margin in the future.

We shall see the next time the EDM Council conducts a round of assessments of the state of data management in the financial industry with Pellustro.

A market commentary provided by

Thomas Bodenski, Partner, Element22

The opinions expressed are as of August 2017 and may change as subsequent conditions vary.