The committee measured this progress — or lack of it — against basic standards for compiling and reporting risk data that were agreed in 2013. At that time, supervisors like me thought that a very reasonable deadline for full compliance was January 1, 2016.
But by 2017, only three of the 30 “global systemically important banks” were up to snuff while compliance rates across individual standards had barely improved at all. It will be surprising if even half of the big 30 are fully compliant by the end of this year. The Basel committee did not identify which banks had fallen short, but the complete list of 30 includes JPMorgan, Goldman Sachs, Barclays, BNP Paribas and Credit Suisse as well as four Chinese banks.
The Basel data standards are really basic. They say the big banks should always be able to paint an up-to-date, comprehensive picture of the risks they face. When Lehman Brothers collapsed in 2008, most institutions struggled to measure their exposure to it. That should never happen again.
Global banks need good data governance rules, rigorous data management and strong IT infrastructure. In many respects they are already in the data management business. Is it too much to ask that they have sufficient data to measure the operational and financial risks they are taking?
Financial technology is proliferating and algorithms are driving more and more banking activity. Current, accurate data are quickly becoming a necessity. The threat of cyber attacks only adds urgency to the need for comprehensive risk management. Nowadays, losses or disruptions in one corner of a big bank can infect the whole institution in no time at all.
While the banks generally have good intentions, they keep falling behind schedule. Some of their “reasons” sound like excuses: supervisors made the standards more specific; having multiple, old IT systems makes it hard to integrate information; and banking is changing so fast that IT can’t keep up. None of these really hold water.
The truth is that these banks have a deep and fundamental data problem. The definitions, ownership and rights to data vary enormously within big institutions, let alone across them. It is a huge job to rebuild a complex bank’s data and IT architecture. Moreover, management often seem not to care. The traders, lawyers, accountants and economists who run big banks aspire to be world class in many ways, but they often still settle for data that is only “good enough”.
Banks are not taking their knowledge gap seriously enough. This is partly because, when it comes to risk data, their supervisors have not been serious either. They see the problem, but so far they have only talked at banks rather than made them act.
Full article on Financial Times (subscription required)