In 2008, as Lehman Brothers’ collapse set off a chain reaction throughout the banking industry, regulators began visiting banks to examine their risk exposure, seeking to gauge the potential damage that could be done. Something they should, arguably, have done some time before.
Banks struggled to put this information together, highlighting a complacency and malaise that likely exacerbated the problems of the crisis. Lehman Brothers’ collapse heralded the beginning of a new era of regulation, though, with Dodd-Frank, introduced in 2010, and Basel III, in 2011, among the most far-reaching and complex. Over $100 billion in fines have been paid in the US for non-compliance since 2007, and with a new Republican-led administration entering power, it is unclear what the future holds.
The time and cost of regulatory compliance and reporting increase vastly with every new regulation. Regulatory texts must, by their very nature, be thorough, and many run to hundreds of pages. Keeping up with them places additional strain on financial services institutions at a time when new competition from FinTech is already closing in.
Each new industry regulation and associated deadline creates an influx of new data that has to be stored and analyzed, and garnering insights from it rapidly is vital for streamlining and optimizing processes and for pinpointing any potential problem areas.
The Basel III framework, for example, largely focuses on capital adequacy, ensuring that banks hold the capital necessary to absorb shocks. It also sets out the minimum data standards that large organizations should meet. Analytics is vital for optimizing both the balance sheet and risk management to ensure that such rules are abided by.
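To make the capital requirement concrete, the sketch below checks illustrative balance-sheet figures against two well-known Basel III floors: Common Equity Tier 1 (CET1) of at least 4.5% of risk-weighted assets and total capital of at least 8%. The function names and the sample figures are hypothetical; buffers and other requirements layered on top of these minimums are omitted for brevity.

```python
# Hypothetical sketch: checking capital against Basel III minimum ratios.
# CET1 >= 4.5% of risk-weighted assets (RWA); total capital >= 8% of RWA.
# Buffer requirements (e.g. the capital conservation buffer) are ignored here.

def capital_ratios(cet1, total_capital, rwa):
    """Return (CET1 ratio, total capital ratio) as fractions of RWA."""
    return cet1 / rwa, total_capital / rwa

def meets_basel_iii_minimums(cet1, total_capital, rwa):
    cet1_ratio, total_ratio = capital_ratios(cet1, total_capital, rwa)
    return cet1_ratio >= 0.045 and total_ratio >= 0.08

# Illustrative figures (in $bn):
print(meets_basel_iii_minimums(cet1=50, total_capital=95, rwa=1000))  # True: 5% CET1, 9.5% total
print(meets_basel_iii_minimums(cet1=40, total_capital=70, rwa=1000))  # False: 4% CET1, 7% total
```

In practice the hard part is not the arithmetic but producing reliable CET1 and RWA figures in the first place, which is where the data management issues discussed below come in.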
The banking industry has grown substantially over the last 20 years through mergers and acquisitions, focusing on revenue generation rather than on developing data infrastructure. The result is complex legacy systems that have produced many failures. There is now, however, a significant amount of regulation around how data is managed, adding a further layer of complexity. Banks need their data management to be faultless, and data quality must be high enough to eliminate discrepancies across different information points, ideally stopping discrepancies from entering the database in the first place. Financial services firms need to ensure that data quality standards - completeness, conformity, accuracy, duplication, and integrity - are all met.
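Three of the data quality dimensions named above can be sketched as simple checks over a batch of records. This is a minimal, hypothetical illustration: the field names, account-code pattern, and sample records are invented, and a real pipeline would cover accuracy and integrity too, typically by reconciling against reference data.

```python
# Hypothetical data quality checks: completeness, conformity, duplication.
# Each check returns the ids of records that fail it.
import re

records = [
    {"id": 1, "name": "Alice", "account": "GB001", "balance": 1500.0},
    {"id": 2, "name": "",      "account": "GB002", "balance": 300.0},   # missing name
    {"id": 1, "name": "Alice", "account": "XX-99", "balance": 1500.0},  # duplicate id, bad code
]

def completeness(recs):
    # Mandatory fields must be populated.
    return [r["id"] for r in recs if not r["name"]]

def conformity(recs):
    # Account codes must follow the expected pattern (illustrative: "GB" + 3 digits).
    return [r["id"] for r in recs if not re.fullmatch(r"GB\d{3}", r["account"])]

def duplication(recs):
    # Each record id should appear only once.
    seen, dupes = set(), []
    for r in recs:
        if r["id"] in seen:
            dupes.append(r["id"])
        seen.add(r["id"])
    return dupes

print(completeness(records))  # [2]
print(conformity(records))    # [1]
print(duplication(records))   # [1]
```

Running such checks at the point of ingestion, rather than at report time, is what "stopping discrepancies from entering the database in the first place" amounts to in practice.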
Analysis of this data may also be aided by advances in cognitive computing, which can help financial institutions cope as data volumes grow exponentially over the next few years, supporting data-driven discovery and decision making even as the analysis required moves beyond human capabilities. This enables them to focus reviews on key risks and threats and either identify them early or prepare a legal defense should the regulator discover them first. With well-managed, high-quality data, financial services organizations can respond rapidly to regulators' requests and ensure they are always one step ahead.