
Leveraging Big Data for regulatory projects

This article explores the feasibility of using Big Data technology in the areas of banking and financial regulation. It examines the challenges involved and outlines the way forward.

Dodd-Frank, Basel III and Comprehensive Capital Analysis and Review (CCAR) are among the terms with which every bank employee is familiar. Since the 2008-09 financial crisis, regulatory authorities have devised a set of stringent rules to ensure that banks have adequate capital during times of financial stress.

Banks and other financial institutions such as insurers are spending huge amounts of money on regulatory compliance and adding more staff to meet regulatory requirements. The Financial Times reported that in 2013 JP Morgan alone added 4,000 employees to its compliance team and spent an additional US$1bn on controls. Citigroup reported that greater efficiencies saved US$3.4bn in a year, but 59% of that figure – roughly US$2bn – was then consumed by new compliance spending. HSBC expanded its compliance department from 2,000 to almost 7,000 personnel.

These statistics show that meeting regulatory requirements is putting great pressure on the financial services sector in terms of the costs involved and strained resources. It is impacting banks’ ability to carry out non-regulatory projects that could improve customer service, increase operational efficiency and speed up customer response.

Banks are looking at newer methods and technologies to help rein in costs while ensuring compliance with the latest regulatory requirements. Big Data is among the technologies touted as a go-to solution for meeting them.

So what is Big Data? Does it live up to the hype surrounding it? Will it help banks meet regulatory challenges and lower costs? These questions require a little more analysis.

Similarity between Big Data and regulatory data

There is considerable similarity between the data that financial institutions generate and what is described as “Big Data”.

Figure 1: Similarity between Big Data and regulatory data
Big Data

Big Data deals with very large data sets that are complex to extract, store and utilise. There are multiple applications in the fields of sports, medicine, retail and, of course, banking and finance. Big Data is viewed as the “holy grail” that can help solve many business problems and detect patterns in data. This could boost sales, inform the selection of the right investment portfolio and help in areas of healthcare, among many potential applications.

The characteristics of Big Data include the following (a short sketch after this list shows how some of them can be checked in practice):

Volume: The quantity of data generated. The size of the data determines its value and potential insight – and whether it can actually be considered Big Data.
Variety: The differences in the type and nature of the data.
Velocity: The speed at which the data is generated and processed.
Variability: The inconsistency of the data set.
Veracity: The quality of the captured data, which can vary greatly and affects the accuracy of analysis.
Complexity: Managing data coming from multiple sources can be challenging. Data must be linked, connected and correlated so users can query and process it effectively.
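
To make these dimensions concrete, here is a minimal sketch in Python that profiles a data set against three of them – volume, variety and veracity. The file name and columns are hypothetical assumptions, not taken from any particular bank’s systems:

```python
import pandas as pd

# Hypothetical extract of transaction records pulled from several source systems.
df = pd.read_csv("transactions_sample.csv")

profile = {
    "volume_rows": len(df),                              # Volume: sheer number of records
    "variety_types": df.dtypes.astype(str).nunique(),    # Variety: number of distinct column types
    "veracity_null_pct": df.isna().mean().mean() * 100,  # Veracity: overall share of missing values
}
print(profile)
```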

Regulatory data

CCAR, Dodd-Frank and other regulatory frameworks revolve around data, data and more data. The basic characteristics of data required for regulatory projects and the similarities between regulatory data and Big Data are as follows:

1. First and foremost, financial and risk data must be accurate, in keeping with the principle of veracity.
2. Millions of financial transactions take place daily, generating transactional data every hour of every day. Hence the velocity at which data is generated is enormous.
3. Different types of data are generated for loans, credit cards, mortgages, options and many other financial instruments, reflecting the variety of the data.
4. For global systemically important banks (GSIBs) and domestic systemically important banks (DSIBs), a huge volume of data is generated on a weekly, monthly and yearly basis.
5. Data sits in multiple systems spread across the globe, increasing the complexity of the data stored.

Despite these similarities with regulatory data, Big Data is not the go-to technology for banks’ and financial institutions’ regulatory projects, and there are multiple reasons for this.

Challenges of using Big Data for regulatory projects

A recent Capgemini study found that eight in 10 of the organisations polled had Big Data projects underway, yet of these only 27% described the project as ‘successful’ and 8% as ‘very successful’. With this statistic in mind, what are the barriers to using Big Data for regulatory projects?

1. Legacy systems: A regulatory project for a GSIB involves sourcing data from multiple countries, for multiple products, stored in multiple systems. Many of these are legacy systems, from which it is difficult to extract data and which may not be well suited to Big Data technologies. Indeed, this is the number one factor preventing the use of Big Data technology.
2. Data in silos: There is no uniform view of data, and most organisations have not integrated their disparate data sources. This prevents them from quickly implementing cutting-edge data technologies, as much time is spent understanding the data, the way it is stored and so on. Various divisions within the organisation have to work together to come up with a comprehensive view of the data – this is particularly important for meeting regulatory requirements.
3. Poor co-ordination: For a Big Data project to succeed, the regulatory data made available requires inputs from risk and finance officers, who need to identify it and perform calculations on it. The technology team needs to work closely with the risk and finance teams to understand the data, set up a process for data cleansing and storage, and then work towards a Big Data architecture. With tight deadlines and pressure to adhere to the regulatory timetable, communication between the teams breaks down – which is not conducive to implementing new technology solutions.
4. Management buy-in: Successfully leveraging Big Data for regulatory projects requires strong buy-in from top management, which often invests in Big Data solutions in order not to be left behind rather than with a clear vision of how they will contribute to strategic goals. Senior managers may also become overwhelmed by the technology, rather than thinking in terms of business solutions.

Is Big Data feasible for regulatory projects?

The challenges outlined above lead many banks to question the use of Big Data technology for regulatory projects. So what are the advantages?

Reduced costs:
Data storage costs are falling. Consider Hadoop, the poster child of Big Data technology, which allows the storage of as much data as required, in any form, simply by adding more servers to a cluster. This makes data storage far less costly than older methods.
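
As a rough illustration of this model, and assuming a Spark cluster with HDFS – the paths and the trade_date column below are hypothetical – landing raw transaction files into cheap, partitioned cluster storage can be as simple as:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("regulatory-landing").getOrCreate()

# Read raw transaction extracts in whatever form the source systems produce them.
transactions = spark.read.csv("hdfs:///landing/transactions/*.csv",
                              header=True, inferSchema=True)

# Append to a partitioned store on the cluster; capacity grows by adding servers.
(transactions.write
    .mode("append")
    .partitionBy("trade_date")
    .parquet("hdfs:///regulatory/transactions"))
```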

Breaking down of silos:
Data stored in silos is a major challenge in regulatory projects. Big Data technologies allow all data to be stored in one place, making access to historical data and to the transactional data generated daily easier for both internal teams and regulators.
Take the example of pre-provision net revenue (PPNR) and the data complexities involved in its calculation. With the emergence of stress testing, PPNR modelling has gained great significance. It is seen as a core part of the stress-testing process and helpful in financial planning. Getting the PPNR calculation right is a significant step in ensuring robust risk assessment.
However, among the biggest challenges in calculating PPNR is data availability. The process is complex and requires 10 years of fee and volume data in order to capture economic and business cycles. Big Data is an excellent technology to use in such circumstances, as huge volumes and a wide variety of data can be stored at a granular level.
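
As a sketch of what this might look like – assuming a hypothetical granular history of fee and balance records on the cluster, with invented column names – ten years of daily data can be rolled up into quarterly PPNR model inputs:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ppnr-inputs").getOrCreate()

# Hypothetical granular history: one row per account per day,
# with fee income and balance columns.
history = spark.read.parquet("hdfs:///regulatory/fee_volume_history")

ppnr_inputs = (
    history
    # Keep roughly ten years of history to capture full economic cycles.
    .where(F.col("as_of_date") >= F.add_months(F.current_date(), -120))
    .groupBy(F.year("as_of_date").alias("year"),
             F.quarter("as_of_date").alias("quarter"),
             "business_line")
    .agg(F.sum("fee_income").alias("total_fees"),
         F.avg("balance").alias("avg_volume"))
)

ppnr_inputs.write.mode("overwrite").parquet("hdfs:///regulatory/ppnr_inputs")
```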

Comprehensive view:
A further advantage of Big Data is that it provides a comprehensive view of all financial transactions. In the current global economy a Chinese company can borrow from a European bank and invest in Africa. Regulators want to understand each step of such a transaction. Big Data helps provide a comprehensive view of the data’s transformation and ensures a high level of transparency by helping to track every transaction.
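
A minimal sketch of what such a view could look like, assuming that transaction legs captured from different systems share a hypothetical deal_id, groups the legs of each deal into an ordered chain:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transaction-chains").getOrCreate()

# Hypothetical table of transaction legs captured from systems worldwide,
# where legs of the same deal share a deal_id.
legs = spark.read.parquet("hdfs:///regulatory/transaction_legs")

# One row per deal, with its legs ordered by timestamp -- e.g. a loan drawn
# in Europe, moved to the borrower, then invested in Africa.
chains = (legs
    .groupBy("deal_id")
    .agg(F.sort_array(
             F.collect_list(
                 F.struct("leg_timestamp", "jurisdiction", "counterparty")))
          .alias("steps")))
```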

Speed and accuracy:
Emerging Big Data technologies help retrieve data quickly and accurately. Automating extraction and storage reduces manual intervention, and with it mistakes and inaccuracies. It becomes easier for the bank to provide the right data to regulators in a timely manner and to raise a red flag immediately if problems arise.
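
For instance, a simple automated validation pass – the notional and counterparty_id fields below are hypothetical – can flag suspect records before they reach a regulatory report:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("regulatory-checks").getOrCreate()
trades = spark.read.parquet("hdfs:///regulatory/transactions")

# Tag each record with the first validation rule it fails, if any.
checked = trades.withColumn(
    "red_flag",
    F.when(F.col("notional").isNull() | (F.col("notional") < 0), "bad_notional")
     .when(F.col("counterparty_id").isNull(), "missing_counterparty"))

# Records with a non-null flag are routed for review instead of being reported.
issues = checked.where(F.col("red_flag").isNotNull())
print(f"{issues.count()} records flagged for review")
```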

Conclusion

The emerging regulatory landscape puts great pressure on banks and financial institutions in terms of the cost and time spent fulfilling regulatory requirements. Big Data’s powerful analytical capabilities and its ability to store and manage large amounts of data offer a strong tool for ensuring that banks remain compliant with the latest regulatory framework. However, success depends on buy-in at the top level and a good risk culture across the organisation.
