Regulatory Compliance

The key to data quality lies in the mind

3-minute masterclass
Data quality blog Projective

Data quality initiatives within the Financial Services sector often take too long and fail to deliver the expected results. IT pushes tools into organisations that then go underused. Business initiatives seem ineffective and the desired benefits do not materialise. Why? Cultural factors are often underestimated in data quality initiatives, even though they tend to be the most difficult hurdles to overcome.

From our own experience, and based on the European Central Bank's (ECB) research, it is clear that data quality at financial institutions needs to improve; without improvement, client relationships suffer. Data quality issues also drive up the surcharge on regulatory capital through the margin of conservatism (MoC) framework. Furthermore, the ECB has started waving its penalty book at the unwilling and the slow. The business case is clear, so what is keeping organisations from acting on it?


This might also be of interest to you: BCBS239 – Tips on how to get your next IT-development or application PERDARR ready


The human biases

Of the many human biases, two are of particular interest here: the ambiguity effect and system justification (also referred to as system thinking). Both are reinforced by the structures prevalent in most financial institutions: strong governance (bureaucracy) and sheer organisational size accelerate these biases, as does the organisational complexity that increases the distance to the customer. Together they create considerable headwinds for any data quality initiative and rapidly erode any progress made.

The ambiguity effect

When the outcome of a data quality initiative is ambiguous, people are less likely to pursue it, and even less so if obstacles need to be overcome, such as aligning reference tables across business domains or running data conversions to correct past wrongs. Improving the data lineage in any single system only fixes a small piece of a longer chain; until the entire chain has been fixed, the spell of garbage in, garbage out still holds. The additional work is an easy target under budget stress: nobody will notice the poor data traceability until the other systems have been fixed as well.

Good data lineage requires a full state machine: it is not just the happy flow that needs to be implemented and tested. This can easily double the implementation cost, but it yields a more stable system with better diagnostics. Projects, however, generally already struggle enough to reach the finish line to care much about the pain during operations, or the future corrective maintenance caused by dropping a few data quality requirements. Note that much of the struggle and unpredictability in the project is due to that same poor diagnostics in the first place, a realisation that is impeded by the next bias discussed.
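To make the "full state machine" point concrete, here is a minimal sketch of what it means to model the unhappy paths explicitly. The state and transition names are illustrative assumptions, not taken from any particular system:

```python
# Minimal state machine for one data-loading step. Error and correction
# states are modelled explicitly, not just the happy flow.
# All state names here are hypothetical examples.
ALLOWED = {
    "received":  {"validated", "rejected"},
    "validated": {"loaded", "rejected"},
    "rejected":  {"corrected", "discarded"},
    "corrected": {"validated"},
    "loaded":    set(),
    "discarded": set(),
}

def transition(state: str, target: str) -> str:
    """Move to `target`, refusing transitions the model does not allow."""
    if target not in ALLOWED[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

# A record that fails validation must pass through correction and
# re-validation; it can never jump straight to 'loaded'. These extra
# paths are exactly the work a happy-flow-only implementation skips.
state = "received"
state = transition(state, "rejected")
state = transition(state, "corrected")
state = transition(state, "validated")
state = transition(state, "loaded")
```

The doubling of implementation cost comes from building and testing every edge in this graph, not just the `received → validated → loaded` path.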

System justification

But why change? We have always done things this way. The ambiguity does not help either, as it only adds to the reasons why change is not worthwhile. Did it not always work in the end? Are the late-night pizza sessions not savoured memories that draw compliments for effort? Either way, we know that getting quick and dirty results is appreciated in our corporate culture, where the "dirty" is easily overlooked, or hard to translate into its detrimental business impact.

For instance: 98% correct data matches sounds good. Every call agent being presented the wrong customer portfolio every two hours sounds very bad. Translating the one number into the other (yes, they represent the same reality) requires business process insight that is often located at an unreachable distance from a development team.
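The translation between the two numbers is simple arithmetic once the operational context is known. The workload figure below (25 customer lookups per agent per hour) is an assumption chosen for illustration, not a figure from the article:

```python
# Translating an abstract match rate into operational impact.
match_rate = 0.98
lookups_per_hour = 25  # assumed workload: lookups one call agent performs per hour

errors_per_hour = (1 - match_rate) * lookups_per_hour          # 0.5 per agent
hours_between_errors = 1 / errors_per_hour                     # 2.0 hours

print(f"Wrong portfolio roughly every {hours_between_errors:.0f} hours per agent")
```

The same 98% that sounds reassuring in a quality report becomes, for a plausible workload, one wrong customer portfolio on an agent's screen every two hours.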

Building on your data capital with the right mindset

Can this paralysing embrace of mutually reinforcing biases be broken? Or are institutions doomed never to live up to regulatory standards, nor to reach a position where they actually benefit from their data capital?

Firstly, we have seen that creating more transparency and understanding the "dirt level" helps break down unconscious behaviour. This only works if management is ready for a long-term effort and keeps asking for the spreads in numbers and coverage ratios. Always remember: when somebody cannot tell you whether a number is right or wrong, the number is not worth looking at. And only in Utopia is everything always 100% accurate.

Second, when management and employees become more aware of the cross-contamination effects of poor data quality, tolerance for it will diminish. Poor data quality is not merely passed on from one system to another; it multiplies each time data from multiple sources are combined. At the quality levels currently encountered at banks, a few combinations are enough to make any data set utterly useless. But do not lose hope: with some knowledge and persistence these biases can be defeated, while simultaneously building up data treasures.
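The multiplication effect can be sketched with a simplifying assumption: if a combined record is only correct when it is correct in every contributing source, and errors across sources are independent, then per-source accuracies multiply:

```python
# Error rates compound when data from multiple sources are combined.
# Assumes independent errors and that a combined record is only usable
# when every contributing source is correct - a deliberate simplification.
per_source_accuracy = 0.98

for n_sources in (1, 2, 5, 10):
    combined = per_source_accuracy ** n_sources
    print(f"{n_sources:>2} sources combined: {combined:.1%} of records fully correct")
```

Even starting from a seemingly comfortable 98%, ten combined sources leave fewer than 82% of records fully correct, and real per-source accuracy is often far lower than 98% to begin with.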


More information on these biases:

Ambiguity effect: the tendency to avoid options for which the probability of a favourable outcome is unknown.

System justification: the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)