The potential for data analytics is being realized across the financial sector. According to the latest Worldwide Semiannual Big Data and Analytics Spending Guide from IDC, worldwide revenues for big data and business analytics (BDA) will grow from US$130.1 billion in 2016 to more than US$203 billion in 2020. And it is banking that is leading the charge, with IDC estimating that the industry spent almost $17 billion on big data and business analytics solutions in 2016.
At the forefront of this is Citi. As the global lead of Citi's Global Data Strategy and Regulatory Risk Reporting, Roberto Ramirez Pinson has deployed global big data strategy roadmaps across 16 countries in APAC and EMEA, both to comply with US Federal Reserve reporting mandates and to reduce manual reporting, cutting 100 FTEs globally. We sat down with him to draw on this experience and gain a better understanding of how the banking industry is using big data. Roberto will also be presenting at the Chief Data Officer Summit, which takes place in Singapore this July 4-5.
Can you give some examples of how big data is changing Citi?
Big data is a powerful capability that, if exploited correctly, can revolutionize businesses, delivering results across the full 360-degree spectrum. Whether it is being used for Sales and Distribution, Marketing, Operations, or Finance and Reporting, big data allows for a more scientific and automated approach, delivering superior results to all stakeholders, both internal (Sales and Marketing, Finance, and other departments) and external (clients, customers, regulators, etc.).
The banking industry has gone through a dramatic period of change that kicked off with the Credit Crisis. That period has led to a push for banks to become far more scientific when it comes to executing strategies, and for that reason Citi has been reshaping itself to focus on specific areas of its business and geographies in order to optimize returns.
Some examples: in Sales and Distribution, matching the right product to the right segment through a client relationship management overview; in Decision Management, supporting the decision-making process with accurate data, whether to grow or wind down specific portfolios; and in regulatory reporting, automating submissions to regulators, eliminating manual work and errors that can lead to heavier impairments.
What challenges have you faced when implementing new data strategies?
Being a global bank brings global challenges to the picture. Not all geographies share the same technologies, and different regulations apply to different locations. Nevertheless, like any international bank with multiple operations around the world, we are obliged to comply with the regulations of the market where we are listed.
On top of that, there are soft factors that come into play when implementing massive changes, such as cultural differences, large structures, and ambiguous organization and responsibility allocations.
In my experience, there are two critical areas that can make or break a data strategy implementation, or any major implementation in general. On one side, there has to be a clear picture of the objective and the short-, medium-, and long-term benefits. Any project may respond to an immediate request, but it should also be outlined with the long-term picture in mind. This allows for optimization and the right leverage of resources. On the other side, there needs to be a clear picture of roles and responsibilities. So many initiatives suffer overwhelming failure because of the elements that fall through the cracks and all the grey areas.
How do you see the use of big data in the banking industry changing in the next five years?
It is no secret that information is power, but not all information is useful. We live in a world in which we are constantly bombarded with information at volumes that keep growing, and we need to be able to discriminate the good information from the not-so-useful data. Big data has been evolving as structures become more complex. Big organizations need three things to make better and faster decisions: good data, good engines, and good representations of data. I believe that the immediate future will focus on high-quality data, followed by stronger efforts to develop more sophisticated tools for data consumption. This will lead to shorter decision-making lead times for strategy development and execution. Indicators that were traditionally reviewed at the end of each operating period will now be updated more frequently, allowing for faster decisions with immediate effects.
What advice would you like to give to companies undertaking a data transformation journey?
Think before you do. Do not jump the gun by getting started before there is a thorough effort to understand what the plan is, what the overall strategy of your organization is, and how you expect data to support and enable that strategy. Over my 20-year career, I have seen so many initiatives go down in flames as a result of a lack of end-to-end purpose. Data is very powerful, but it can also turn into a headache. Identify critical data elements that can be standardized, identify the uses for each and every one of those elements, and, most importantly, build a catalogue of values according to their uses. Select one and stick to it. No moving targets are allowed in these efforts.
What can the audience expect to take away from your presentation?
I am very excited to be part of this event. I have over 20 years of working experience spread across 8 countries and 3 different industries. My intention is to share some of the learning that I have picked up, especially in delivering efforts that are critical for the transformation of the business.
You can hear more from Roberto, as well as a host of other industry experts, at the Chief Data Officer Summit. View the full agenda here.
BONUS CONTENT: David Gledhill, Group CIO and Head of Operations at DBS Bank, discusses how the traditional banking industry leverages technology to keep up with the digital revolution.