Taking risks is vital to success in banking, yet for a long time it has been out of control. One of the primary reasons for the 2008 financial crisis was that bankers were incentivized to take significant risks by the rich rewards if they paid off and the relatively light penalties if they failed. The consequences of this were dire, costing the economy an estimated $22 trillion.
Given human tendencies towards greed and hubris, you would imagine that removing people from the equation would be a good thing. However, as Artificial Intelligence (AI) and machine learning algorithms are adopted more readily by the major financial services companies, many are skeptical as to whether the benefits are worth the potential downsides, and whether banks are using AI to absolve themselves of blame.
Gartner identified machine learning as one of the top ten strategic technology trends of 2016, with advances coming thick and fast across all industries in all fields. In 2015 alone, tech giants Google, Facebook and Microsoft invested over $8.5 billion on AI research, acquisitions, and talent. Major financial institutions - such as fund managers BlackRock, Two-Sigma and Renaissance Technologies - have poached some of the best data scientists in the world, and are spending big to keep up with Silicon Valley. And they’ve got the deep pockets to do so. Andrew Lo, Director of the Laboratory for Financial Engineering at the MIT Sloan School of Management notes how wide ranging the impact will be, saying, ‘I suspect that it’s going to transform all aspects of the financial industry because there are so many parts of it that can be automated using these kind of algorithms and access to large pools of data.’
In a survey by Synechron, Inc., a global consulting and technology innovator in the financial services industry, 71% of respondents said they believe AI will be hugely important over the next decade. A 2015 report from McKinsey & Company also revealed that a dozen European banks had moved from traditional statistical analysis modeling to machine learning, with many citing increased new product sales of 10% and churn and capital expenditure down by 20% as a result.
The most dramatic changes brought about by AI and machine learning are expected to be in trading, financial analysis and risk assessment. The speed at which decisions need to be made in these areas make it vital, and they afford a more in-depth risk assessment to help the risk analyst or underwriter to find information that may have been hidden, deliberately or otherwise, to ensure investments are the right ones. The amount of information available to analysts already far exceeds their ability to make sense of it, and only AI is fast and cheap enough to cope.
The temptation for bankers to rush to adopt the technology is clear. However, a good rule of thumb in banking is that if you’re tempted to do something that will bring in easy money, don’t do it. And AI has trouble written all over it. In 2013, AI taught itself to play Tetris. Eventually, it learned that the easiest way to avoid losing was simply to pause the game. The potential for AI to game the financial system and destroy the economy is always there. There is also a great risk of malfunctioning algorithms. In 2012 US market maker Knight Capital lost over $400m (£261m) in 30 minutes because of a computer glitch. And when this happens, who gets the blame? In the recent Jodie Foster film Money Monster, Dominic West’s antagonist blames a glitch in the algorithm for the loss of $800 million. When there’s no-one to blame, who takes responsibility when things go wrong? The person who created the algorithm? The CEO? The traders? It is extremely doubtful that regulators will be able to respond appropriately to the changes. Few enough people on Wall Street go to jail as it is, AI will likely see that number fall even lower.
According to McKinsey, ’We see financial regulation continuing to broaden and deepen as public sentiment becomes ever less tolerant of preventable errors and inappropriate practices, or of bank failures’. AI can help banks abide by existing regulations, but equally a whole new framework is needed to rein it in. At the moment, a McKinsey survey found that just 3.5% have actively deployed AI. This number is going to shoot up rapidly, and regulators need to be on the mark to ensure the technology does not run out of control.