TACKLING THE TRIM CHALLENGE – HOW BANKS CAN GET THEIR DATA QUALITY PROCESSES UP TO SCRATCH

by Martijn Groot, VP of Product Management, Asset Control

 

The Targeted Review of Internal Models (TRIM) is underway and is significantly impacting banks across the eurozone. TRIM is an initiative of the European Central Bank (ECB), designed to assess whether the internal risk assessment models used by banks supervised by the ECB adhere to regulatory requirements, and whether their results are reliable and comparable.

As part of the programme, the ECB is reviewing the banks' models, giving them 'homework' to improve their processes, and then returning to inspect again. In doing this, however, the ECB understands that detailed discussions with the banks about their risk assessment models will be of little value if it can't trust the data that is fed into them.

 

Data Quality Principles

TRIM builds on the results of the Basel Committee on Banking Supervision's BCBS 239 document, published in 2013. While BCBS 239 set out 14 principles of risk data aggregation for banks to abide by, it was quite generic in nature. TRIM is more specific, especially around data quality aspects and measurements.

In fact, TRIM provides a range of governance principles for creating a data quality framework that covers the relevant data quality dimensions, including completeness, timeliness, accuracy, consistency and traceability.
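As a rough illustration of what these dimensions can mean in practice, the Python sketch below expresses completeness, timeliness, accuracy and consistency as simple checks on a single price record. The field names, thresholds and reference values are all hypothetical and are not taken from the TRIM guide:

```python
from datetime import datetime, timedelta

# Hypothetical price record; field names are illustrative only.
record = {
    "instrument_id": "XS0123456789",
    "price": 101.25,
    "currency": "EUR",
    "source": "vendor_a",
    "as_of": datetime(2019, 5, 20, 17, 0),
}

def check_completeness(rec, required=("instrument_id", "price", "currency", "source", "as_of")):
    """Completeness: every required field is present and populated."""
    return all(rec.get(f) is not None for f in required)

def check_timeliness(rec, now, max_age=timedelta(days=1)):
    """Timeliness: the price is no older than the permitted window."""
    return now - rec["as_of"] <= max_age

def check_accuracy(rec, reference_price, tolerance=0.02):
    """Accuracy: the price sits within a tolerance band of an independent reference."""
    return abs(rec["price"] - reference_price) / reference_price <= tolerance

def check_consistency(price_a, price_b, tolerance=0.02):
    """Consistency: two sources for the same instrument broadly agree."""
    return abs(price_a - price_b) / price_b <= tolerance

now = datetime(2019, 5, 21, 9, 0)
print({
    "completeness": check_completeness(record),
    "timeliness": check_timeliness(record, now),
    "accuracy": check_accuracy(record, reference_price=101.10),
    "consistency": check_consistency(record["price"], 101.30),
})
```

Traceability, the fifth dimension listed above, is less a per-record check than a property of the whole data flow, which is why it deserves separate treatment.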

To comply with TRIM, banks need to show that they can trace the price they have used historically for a model or for a financial instrument valuation back through the data supply chain to its original sources. They also need to know what processes have been carried out on the data: which checks have been conducted, what the sources are, what the initial parameters and data quality rules were, and whether they have changed over time. Traceability is the term used for this within the TRIM document, but data lineage – effectively the information lifecycle, covering the data's origins and where it moves over time – is the broader term more widely used in the data management arena.
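A minimal sketch of what such a lineage record could look like is shown below. The structure, field names and rule versioning scheme are assumptions made for illustration, not a format prescribed by TRIM:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class RuleApplication:
    """One data quality rule run against a value, with the parameters in force."""
    rule_name: str          # e.g. "stale_price_check"
    parameters: dict        # parameter values when the rule ran
    parameter_version: str  # lets reviewers see whether parameters changed over time
    passed: bool
    run_at: datetime

@dataclass
class LineageRecord:
    """Traces a value used in a model back to its original source."""
    instrument_id: str
    value: float
    source: str
    checks: List[RuleApplication] = field(default_factory=list)

    def trace(self):
        """Walk back from the value fed into the model to its origin."""
        print(f"{self.instrument_id}: {self.value} from {self.source}")
        for c in self.checks:
            status = "passed" if c.passed else "failed"
            print(f"  {c.run_at:%Y-%m-%d} {c.rule_name} {c.parameter_version} ({status})")

rec = LineageRecord(
    instrument_id="XS0123456789",
    value=101.25,
    source="vendor_a",
    checks=[RuleApplication("stale_price_check", {"max_age_days": 1}, "v2",
                            True, datetime(2019, 5, 20, 17, 5))],
)
rec.trace()
```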

The TRIM document also contains important reporting guidelines – including that banks will need to report on how often they have proxied their market data inputs or risk calculations.

Doing so also effectively requires a process for how the bank has derived and validated each proxy. Is it really a comparable instrument? Does it behave similarly to the original instrument?

In other words, given TRIM's focus on data quality, it is important that banks regularly validate their proxies. Finally, and in the same vein, to get a better grasp of the quality of the market data they use in risk calculations, banks need to ensure they have a handle on how much of that data is stale per asset class.
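As an illustration of the kind of reporting this implies, the sketch below computes, per asset class, the share of prices that are stale and the share of inputs that are proxied. The observations and field layout are invented for the example:

```python
from collections import defaultdict

# Hypothetical observations: (asset_class, instrument_id, is_stale, is_proxied)
observations = [
    ("equity", "EQ1", False, False),
    ("equity", "EQ2", True,  False),
    ("credit", "CD1", True,  True),
    ("credit", "CD2", False, True),
]

def quality_metrics(obs):
    """Per asset class: percentage of stale prices and of proxied inputs."""
    counts = defaultdict(lambda: {"total": 0, "stale": 0, "proxied": 0})
    for asset_class, _, is_stale, is_proxied in obs:
        c = counts[asset_class]
        c["total"] += 1
        c["stale"] += is_stale
        c["proxied"] += is_proxied
    return {
        ac: {"stale_pct": 100.0 * c["stale"] / c["total"],
             "proxied_pct": 100.0 * c["proxied"] / c["total"]}
        for ac, c in counts.items()
    }

print(quality_metrics(observations))
# {'equity': {'stale_pct': 50.0, 'proxied_pct': 0.0},
#  'credit': {'stale_pct': 50.0, 'proxied_pct': 100.0}}
```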

Today, most banks would struggle to comply with many of the data quality guidelines that TRIM puts in place. Most have no data quality or control frameworks in place or, at best, assess quality in isolated silos. As such, they lack the capability to report daily on key data and metrics. They may have implemented checks and controls, but generally they have little real insight into data across the whole chain. Very few have a full audit trail in place that describes how data flows from sources through quality checks and workflows into the financial models, and that tracks not just data values but also the rules and rule parameters applied to them.

 

Finding a Way Forward

So how can banks effectively meet the TRIM guidelines? First, they need to get the basic processes right. That means putting a robust data governance and data quality framework in place. To do that, they need to document their data management principles and policies. They also need to agree on a common data dictionary and understand more clearly exactly what they are measuring, including how they define financial products across the group and the control model for the whole lifecycle.
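As a very simple illustration, a data dictionary entry might capture a definition, a data type and unit, an accountable owner and the quality rules that apply. The attributes below are assumptions about what a group-wide definition could include, not a standard:

```python
# Illustrative data dictionary entry; attribute names are hypothetical.
data_dictionary = {
    "clean_price": {
        "definition": "Quoted price excluding accrued interest",
        "data_type": "decimal",
        "unit": "percent of par",
        "owner": "Market Data Operations",  # accountable party in the control model
        "quality_rules": ["completeness", "timeliness", "accuracy"],
    },
}

# Any system or report using the field can then resolve one shared meaning:
print(data_dictionary["clean_price"]["definition"])
```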

The next phase is putting in place the technology that allows banks to achieve this. Organisations first need a data management system with the end-to-end capability to gather, integrate and master key data, derive risk measures and publish them to different groups. This should provide banks with a single funnel and a consistent set of data and data quality metrics that support TRIM compliance.
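To make that end-to-end flow concrete, here is a deliberately simplified sketch of such a pipeline, with gather, integrate, master, derive and publish stages. The stage logic and data shapes are illustrative only and do not describe any specific product:

```python
def gather(sources):
    """Collect raw records from each data source."""
    return [rec for src in sources for rec in src()]

def integrate(records):
    """Normalise raw records into one common schema."""
    return [{"instrument_id": r["id"], "price": r["px"]} for r in records]

def master(records):
    """Keep a single 'golden' record per instrument (last source wins here)."""
    golden = {}
    for r in records:
        golden[r["instrument_id"]] = r
    return list(golden.values())

def derive(records):
    """Derive downstream values; as a stand-in, flag prices above par."""
    for r in records:
        r["above_par"] = r["price"] > 100.0
    return records

def publish(records, consumers):
    """Push the same consistent data set to every consuming group."""
    for consume in consumers:
        consume(records)

# Example wiring with two toy sources and one consumer:
vendor_a = lambda: [{"id": "XS0001", "px": 101.3}]
vendor_b = lambda: [{"id": "XS0001", "px": 101.4}, {"id": "XS0002", "px": 99.8}]
publish(derive(master(integrate(gather([vendor_a, vendor_b])))), [print])
```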

It is worth highlighting too that there are benefits for banks that achieve all this which go beyond simply complying with TRIM – important though that is. Some of the remediation they will have to do to comply will also be necessary for other key regulations, including the Fundamental Review of the Trading Book (FRTB). However, for many, TRIM is their current focus, and with the programme expected to run until 2021 only, banks know there is still much work to do in order to meet its guidelines.
