Data standardization – the process of developing and implementing rules for how data are described and recorded – offers significant benefits for financial firms and regulators. Well-designed standards can reduce reporting burdens and enable firms to better track and manage risks. Data standardization can also make it easier for regulators to understand where risks lie, see interconnections within the financial system, and identify trends in a timely fashion. Moreover, as the financial crisis vividly revealed, the information gaps that arise in the absence of standards can exacerbate financial fragility. All of this suggests that data standardization should be easy to achieve.
Shifting from abstract, incentive-based analyses to on-the-ground observation reveals a very different story. Simple data standards widely endorsed by public and private actors have been only partially implemented. The pace at which new standards are devised to address recognized needs lags far behind the optimal level, however measured. Meanwhile, the topic has received only scant attention from academics and from third-party monitors such as the financial press.
The authors identify myriad frictions favouring the status quo, including regulatory fragmentation, upfront costs that yield only delayed gains, and the need for coordination among numerous and diverse actors. That these factors are well worn does not loosen their grip; indeed, their very familiarity may exacerbate the tendency to focus on other, more exciting topics.
The authors further suggest that the first-order importance of these banal forces in explaining the gap between optimal and actual financial regulation is not unique to data standardization. Those on the left often blame inadequate financial regulation on capture; those on the right, meanwhile, decry excessive regulation as the key impediment to growth. Both accounts reach for dramatic explanations, yet the experience with data standardization suggests that mundane frictions often do much of the work.