
The Data Behind True Risk Management

Reliable financial data that is accurate, consistent and accessible should go quietly unnoticed in the noise of day-to-day operations. But when the right data is absent, inadequate, difficult to understand or simply wrong, the pain is felt throughout the organisation every minute of each day.

In an increasingly competitive and global marketplace, financial institutions need dependable data to help improve time-to-market and productivity, reduce risk and exposure, and ease the compliance burden. Yet many financial institutions are relentlessly challenged by disparate, duplicate or inadequate information flowing throughout their organisation, making it difficult to capture the data needed to make sound business decisions.

Financial institutions must prepare their data infrastructures for new risk management challenges that affect their ability to meet evolving compliance needs, reduce operational risk and free up regulatory capital. As the effects of the credit crisis continue to take shape, there is demand for greater transparency of risk, as well as of the information that drives it.

Bucketed risk approaches have their limitations. A true view of exposures and the soundness of procedures requires integrating risk across asset types as well as across risk types – credit, operational, liquidity and market risk – and converging the risk, finance and management accounting functions. Only the right information, driven by accurate and reliable data across the front, middle and back offices, can achieve true risk management.

The Effects of Poor Data

The impact of inaccurate, incomplete or late information is typically vastly underestimated. To appreciate the magnitude of the issue, ask yourself a few questions:

  • How accessible is the information necessary for making decisions?
  • Are new business initiatives delayed due to a fundamental lack of trust in your data?
  • Is there consistency in information between various stages of transaction processing?
  • Will regulators and customers get different answers to the same question from different parts of your company?
  • Do you know the repercussions of an error in your revaluation prices, corporate actions, security master, and legal entity data?
  • Can you always determine where your information came from?
  • Can you track the movement of data from one system to another?

The consequences of poor and incomplete data can be dire. Faulty data can result in limit breaches and confusion about exposures to counterparties. Poor prices can lead to mis-priced products, faulty revaluation of customers’ or internal portfolios, and bad trading decisions – not to mention regulatory troubles. Indeed, a financial institution’s data can be a major source of operational risk. Lack of transparency in the pricing and structuring of financial products has caused havoc around the world.

The old banking adage of ‘know your customer’ broke down in the securitisation of subprime mortgages, as banks no longer had an incentive for loans to perform once they were off the balance sheet. Counterparty risk is notoriously difficult to fathom. First-order effects are still relatively easy to chart (once cross-reference and legal structure data issues have been sorted out), but indirect exposures through other parties can be much higher and far more difficult to understand fully.
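
To make the distinction concrete, the sketch below is a deliberately naive Python illustration – the counterparties, the amounts and the 50% pass-through haircut are all invented assumptions, not a real methodology. The direct exposure can be read straight off a table; the chained exposure through an intermediary only emerges from walking the whole network.

```python
# direct_exposure[a][b]: what party a stands to lose if b defaults.
# All names and figures below are hypothetical.
direct_exposure = {
    "BankA": {"FundX": 50.0, "BankB": 20.0},
    "BankB": {"FundX": 80.0},
}

def total_exposure(graph, source, target, passthrough=0.5, seen=None):
    # Direct exposure to `target`, plus a haircut share of what each
    # intermediary would itself lose to `target`.
    seen = seen if seen is not None else set()
    total = graph.get(source, {}).get(target, 0.0)
    for mid, amount in graph.get(source, {}).items():
        if mid != target and mid not in seen:
            chained = total_exposure(graph, mid, target,
                                     passthrough, seen | {source})
            total += passthrough * min(amount, chained)
    return total

print(total_exposure(direct_exposure, "BankA", "FundX"))   # 60.0, not 50.0
```

Even in this toy network, the table-level view understates the exposure; with realistic graphs and netting arrangements, the gap between the two views can be far larger.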

For many complex structured products, the number of terms and conditions to track, model and administer runs into the many thousands. Adding business rules to monitor, such as constraints on the composition of a portfolio, often poses a hefty integration challenge.
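
As a minimal illustration of what a single such business rule looks like, the Python sketch below checks one invented composition constraint – a hypothetical 25% issuer-concentration limit. A production rule set would number in the thousands and need the same governance, versioning and audit trail as the data itself.

```python
from collections import defaultdict

def check_concentration(positions, limit=0.25):
    # positions: iterable of (issuer, market_value) pairs; returns the
    # issuers whose share of total market value exceeds the limit.
    by_issuer = defaultdict(float)
    for issuer, mv in positions:
        by_issuer[issuer] += mv
    total = sum(by_issuer.values())
    return {iss: mv / total for iss, mv in by_issuer.items()
            if mv / total > limit}

# Hypothetical portfolio: IssuerA appears in two positions.
portfolio = [("IssuerA", 30.0), ("IssuerB", 20.0),
             ("IssuerA", 10.0), ("IssuerC", 15.0), ("IssuerD", 25.0)]
print(check_concentration(portfolio))   # {'IssuerA': 0.4}
```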

The Limitations of Traditional Risk Management

Common elements of financial crises are over-eager lending, careless investing and a widespread failure of risk management. It is almost a natural law that risk will end up where it is least understood, and no amount of regulation, product innovation or investor protection directives will ever change that. An incomplete understanding of the terms and conditions of items in the portfolio therefore puts institutions at a huge disadvantage. Furthermore, the way risk management has historically operated has not helped.

Traditionally, risk as a business function has been distributed across an organisation, in much the same way that products and services have been divided across business lines. Banks organised their sales and trading businesses into, for instance, equity, short-term interest rate, credit and commodity desks, with correspondingly separate reporting lines for market and credit risk, asset & liability management (ALM) and operational risk. Often these different reporting lines only came together in a risk function at group level.

The use of derivatives has forged much closer ties between formerly more independent markets, and similarly the distinction between market, credit and operational risk is no longer clear-cut. Many structured products straddled the boundaries of separate risk drivers and blended credit, commodity, interest rate, foreign exchange and equity risk.

How Far Can VaR Go?

To date, Value at Risk (VaR) methodologies have played a large role in producing risk numbers. However, VaR models, whether using Monte Carlo or historical simulation, are useless at predicting catastrophes. In fact, they may even have perverse effects, since a rise in volatility produces higher VaR numbers, which can trigger limit breaches and forced selling.

VaR also tends to measure ‘business as usual’ and cannot show the magnitude of a potential downturn: insight into tail risk is needed, yet VaR cannot provide it. Number myopia can be deadly: all the dashboards and indicators in the world are worthless when the underlying assumptions (such as correlation) on which they were based no longer hold. Terabytes of tick-by-tick data can be collected on exchange rates, but alone they cannot help manage risk – a qualitative view is also needed.
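
A minimal sketch makes the blind spot concrete. The Python below estimates one-day VaR by historical simulation over an invented ‘calm market’ sample; any loss larger than the worst day in that window is, by construction, invisible to the estimate.

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    # Historical simulation: the VaR estimate is simply a loss
    # quantile of the observed return sample.
    losses = -np.asarray(returns)              # positive values = losses
    return np.percentile(losses, confidence * 100)

# Invented 'calm market' history: ~1% daily volatility, no crash days.
rng = np.random.default_rng(42)
calm_history = rng.normal(loc=0.0, scale=0.01, size=500)

print(f"99% one-day VaR: {historical_var(calm_history):.2%}")
# A 15% crash day lies far outside this sample, so the estimate
# cannot anticipate it until after it has already happened.
```

Monte Carlo variants replace the historical sample with simulated draws, but they inherit the same dependence on the assumed distribution and correlations.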

New products often lack historical data and also need to be understood by linking them to the factors that drive their valuation – whether interest rate, currency, macro, commodity, credit spread, event or correlation driven – as well as the cross-links between these drivers. This requires a sound and transparent data infrastructure, one flexible enough to accommodate new products while preventing them from being priced and revalued only in spreadsheets. The terms and conditions of any product will come under much more scrutiny from more stakeholders within a financial institution, as formerly separate business areas come to require the same information.
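
One way to picture such an infrastructure – a sketch under stated assumptions, not a prescription – is a security master in which every product record must declare its valuation drivers up front. The identifiers and field names below are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    # Hypothetical security-master record: a product cannot enter the
    # infrastructure without declaring the factors driving its valuation.
    identifier: str
    product_type: str
    valuation_drivers: list = field(default_factory=list)

cln = Product(
    identifier="XS0000000000",          # placeholder ISIN, not a real one
    product_type="credit-linked note",
    valuation_drivers=["interest_rate", "credit_spread", "correlation"],
)

def products_driven_by(products, driver):
    # Any downstream risk view can ask which positions share a driver.
    return [p.identifier for p in products if driver in p.valuation_drivers]

print(products_driven_by([cln], "credit_spread"))   # ['XS0000000000']
```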

The Power of Holistic Views of Risk

VaR methods and scenario analysis aside, there are inherent limits on predictability and control. Any business has to decide which risks are part of the business and which are not. The purpose of risk management is to increase the value of the business, and one of the key purposes of data management is the enabling of risk management.

Enterprise risk – a holistic view across asset classes (credit, interest rate, foreign exchange, commodity and equity) as well as across risk types (market, credit, operational and liquidity) – needs to become a reality and replace the current bucketed approach. The integration of these traditionally separate risk buckets will not stop at an enterprise risk view: the traditionally separate functions of risk management, finance and management accounting will also move closer together.
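
As an illustrative sketch of what replacing the bucketed approach means for data, the Python below keeps a single exposure table keyed by both asset class and risk type (all figures invented), so that the desk view, the risk-type view and the enterprise view are merely different aggregations of one dataset rather than separately maintained silos.

```python
from collections import defaultdict

# One exposure table keyed by (asset_class, risk_type) instead of a
# separate silo per desk; all figures are hypothetical.
exposures = defaultdict(float)

def book(asset_class, risk_type, amount):
    exposures[(asset_class, risk_type)] += amount

# A single credit default swap position touches several buckets at once.
book("credit", "market", 5.0)
book("credit", "credit", 12.0)
book("credit", "operational", 0.5)

# The enterprise view is just another aggregation of the same dataset.
by_risk_type = defaultdict(float)
for (asset_class, risk_type), amount in exposures.items():
    by_risk_type[risk_type] += amount
print(dict(by_risk_type))   # {'market': 5.0, 'credit': 12.0, 'operational': 0.5}
```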

One of the focal points of Basel II was to introduce risk metrics closer to the business risk and also to integrate operational risk into the framework. Issues around clearing and settlement of derivatives, such as credit default swaps, can impact market, credit and operational risks. A convergence of these historically differently managed disciplines puts additional demands on data management and system integration.

It is often said that information is power. The reverse is also true: inaccurate and incomplete information leaves an institution powerless against the onslaught of other financial market participants. Clean and consistent data is critical across the front, middle and back office. It is needed for the fair-value pricing of mutual funds, bidding for (distressed) securities, collateral management, the daily revaluation of investment portfolios, simulations and scenario analysis, as well as for market and credit risk reporting, market conformity and compliance checking, client reporting, and decision-making generally.
