Managing Risk and Liquidity During the Credit Crunch

A sub-prime mortgage is a type of mortgage that is normally provided to borrowers with lower credit ratings. As a result of the borrower’s lower credit rating, a conventional mortgage is not offered because the lender views the borrower as having a larger-than-average risk of defaulting on the loan. Lending institutions often charge interest on sub-prime mortgages at a higher rate than on a conventional mortgage in order to compensate for carrying more risk.

In the recent past, sub-prime lenders increased their exposure with a strong conviction that they had a system to mitigate their risk while still making a profit. After they got the borrower to sign on the dotted line, the lending institutions packaged up the loans and sold them to hedge funds, mutual funds and private equity groups looking for quick returns.

To make matters worse, many of the buyers of these packaged loans used borrowed funds, thus adding another layer of debt to a shaky foundation. The shares of these companies were then bought by pension funds and insurance companies looking for high returns, even though they would never have bought the risky mortgages outright.

This layer upon layer of risky debt was created to fuel short-term profits on a shaky foundation (i.e. sub-prime mortgages). When the borrowers at the base of this structure started defaulting, the entire structure began to collapse, leading to the ‘sub-prime crisis’.

At present, the global banking system is facing a major credit and liquidity crisis. Losses from sub-prime mortgages are creating a credit crunch that may trigger a global slowdown. Over the last year, major financial institutions have written off nearly US$300bn, and central banks around the world have initiated emergency measures to restore liquidity.

Flaws in Risk Assessment Systems During the Crisis

Excessive reliance on mathematical models

Fundamental analysis suggests that the deficiencies in the risk management process stem from de-emphasising the common-sense aspects of risk management and over-relying on mathematical risk management tools.

Mathematical tools such as parametric models can, through various quantitative techniques, uncover risks that might otherwise not be readily apparent. However, there are also qualitative or judgmental aspects of risk that these models fail to spot. It can be a grave mistake to conclude that risks are absent just because mathematical tools have not detected or revealed them. Some kinds of risk lie beyond the reach of mathematical tools and can only be uncovered by other, more intuitive means.
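
As a simple illustration of what such parametric tools do (and do not) capture, the sketch below computes a one-day variance-covariance value-at-risk for a hypothetical two-asset portfolio. The positions, volatilities and correlation are invented for illustration, and the normality assumption embedded in the calculation is precisely the kind of simplification that can hide tail and liquidity risks.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical two-asset portfolio (all figures are illustrative only)
positions = np.array([6_000_000.0, 4_000_000.0])   # market values in USD
vols = np.array([0.012, 0.020])                     # daily return volatilities
corr = np.array([[1.0, 0.35],
                 [0.35, 1.0]])                      # assumed correlation matrix

# Variance-covariance (parametric) VaR assumes normally distributed returns
cov = np.outer(vols, vols) * corr
portfolio_sigma = np.sqrt(positions @ cov @ positions)

confidence = 0.99
var_99 = norm.ppf(confidence) * portfolio_sigma
print(f"1-day 99% parametric VaR: {var_99:,.0f} USD")
```

The number produced is only as good as the distributional assumption behind it; once returns stop behaving anything like a normal distribution, as happened during the sub-prime crisis, the reported figure says little about the true exposure.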

Inadequate risk assessment of complex structured products

A further deficiency that has been highlighted is financial systems’ limited ability to view, trade, process and analyse complex financial products. For example, synthetic collateralised debt obligations, which are highly structured instruments (derivatives of derivatives of derivatives), have a structural complexity that is often intractable without simplifying assumptions. The problem for risk management practitioners is that it may not be practical or possible to determine whether the simplifying assumptions are reasonable or whether they circumvent important risks.

Inadequate modeling of funding liquidity and asset liquidity risk

The sudden evaporation of liquidity during the sub-prime crisis led to an unexpected demand for liquidity to fund the depletion of liabilities, such as cash calls and deposit withdrawals (funding liquidity risk). This, in turn, forced institutions to sell assets at unanticipated low prices to meet sudden liquidity demands (asset liquidity risk). Most practitioners were not equipped with dynamic liquidity modeling to safeguard against these risks.

De-emphasis on fundamental credit risk analysis

The rising focus on complex derivatives and structured financing brought the practice of ‘financial engineering’ to centre stage, with a strong conviction that financial engineering could address all the fundamental risks. This, however, sidelined core ‘credit appraisal’ skills focusing on qualitative (judgmental) inputs based on leading indicators and behavioral assessment.

Lack of data infrastructure providing single view of data

Behavioral assessment of credit analysis is a data intensive process. For example, delinquency assessment needs data at the granular level of a transaction or a position for all credit sensitive exposures across the bank’s business units, which would be a staggering volume for a large bank.

Although banks have historically captured ratings performance data, including defaults, very few today have data on all three key risk indicators, i.e. probability of default (PD), loss given default (LGD) and exposure at default (EAD). Even where the data is collected, the bigger question that needs to be answered is whether it is of high quality and in a ready-to-use format. Data complexity is further compounded by the multiplicity of product systems and business lines that capture and store data in local formats.
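
For context, the three indicators combine into the standard expected-loss identity, EL = PD x LGD x EAD. The sketch below aggregates it across a handful of hypothetical exposures; the figures are illustrative, and the calculation is only meaningful in practice if PD, LGD and EAD are captured consistently at transaction level across business units.

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    name: str
    pd_: float   # probability of default (1-year)
    lgd: float   # loss given default, as a fraction of exposure
    ead: float   # exposure at default, in currency units

    @property
    def expected_loss(self) -> float:
        # Standard expected-loss identity: EL = PD x LGD x EAD
        return self.pd_ * self.lgd * self.ead

# Hypothetical exposures from different business units (illustrative figures)
book = [
    Exposure("retail mortgage pool", 0.025, 0.40, 12_000_000),
    Exposure("corporate term loan",  0.010, 0.55,  8_000_000),
    Exposure("credit card pool",     0.060, 0.85,  3_000_000),
]

total_el = sum(e.expected_loss for e in book)
for e in book:
    print(f"{e.name:22s} EL = {e.expected_loss:,.0f}")
print(f"{'portfolio':22s} EL = {total_el:,.0f}")
```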

Silo model prototypes used for risk assessment

Silo models developed for risk assessment were only ever meant to be used as prototypes, for back-testing and for checking the applicability of the underlying theory to the local microeconomic environment. Most practitioners, however, still use such models for actual risk assessment as part of day-to-day operations. This exposes them to operational risk, which always takes centre stage in catastrophic crisis events. The limitations of Excel model prototypes are as follows:
  • Data input and extraction, parameter changes and the addition of new products are all driven manually, with no automation.
  • As an Excel model, it lacks the stability needed for large data simulations, and the security and scalability of the analytics are major issues.
  • Performance is poor, with substantial time taken to produce results after running the simulations.
  • There is no audit trail to track changes.

Emerging Best Practice

Actively engage in horizon scanning

Horizon scanning is the art and science of detecting and assessing ‘weak signals’ emanating from the internal and external environment so as to forewarn policymakers and other stakeholders about approaching ‘shocks’ and other easy-to-miss trends and ostensibly distant events. The overall goal of the horizon scanning technique is to improve the capability to anticipate and prepare for new risks and opportunities. The scanning techniques enable practitioners to make intelligent decisions, and to develop policy and strategy that is ‘forward looking’ and encompasses the issues and trends of the wider environment.

These techniques can be used in policy formulation, strategic analysis, strategy formulation and implementation to achieve robust outcomes. Techniques such as backcasting and the reverse engineering of preferred scenarios can help to understand which policy interventions need to be put in place to achieve the desired results, and how those already in place could be affected by future events. Horizon scanning and futures techniques help to:

  • Enhance the ability to ‘think outside the box’ and assess the ‘external environment’.
  • Anticipate and prepare for new risks and opportunities.
  • Inform future policies and strategies by creating a better understanding of potential future contexts or stresses.
  • Future-proof policies and strategies by testing them against a range of possible futures.
  • Highlight low-probability/high-risk events.
  • Look at today’s periphery to spot tomorrow’s centre.
  • Build awareness and appreciation.

Supplement mathematical models with intuitive assessment

The knowledge that seemed to emerge from recent experience has the strong flavor of ‘plain old common sense’ in risk assessment. Accordingly, risk management professionals must arm themselves not only with the best possible mathematical tools, but also with methods for uncovering risks that evade discovery by mathematical means.

Improve transparency of risk exposure

Use ‘non-parametric’ models, such as historical simulations with data windows chosen to represent periods of extreme volatility, and Monte Carlo simulations coupled with stress testing, to understand the impact of potential outcomes and the implications for highly structured product portfolios.
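
A minimal sketch of the historical-simulation approach, under assumed inputs, is shown below: rather than assuming a return distribution, it re-prices today’s portfolio against a window of daily returns (here generated randomly as a stand-in for an observed high-volatility window) and reads value-at-risk off the empirical loss distribution. The stress run simply rescales the window to mimic a higher-volatility regime.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for a historical window of daily portfolio returns; in practice this
# would be actual observed returns from a deliberately stressful period.
hist_returns = rng.normal(loc=0.0, scale=0.012, size=500)

portfolio_value = 10_000_000.0

def historical_var(returns: np.ndarray, value: float, confidence: float = 0.99) -> float:
    """Empirical VaR: the loss at the chosen percentile of the historical P&L."""
    pnl = value * returns
    return -np.percentile(pnl, 100 * (1 - confidence))

base_var = historical_var(hist_returns, portfolio_value)

# Crude stress test: rescale the window to a regime with double the volatility
stressed_var = historical_var(2.0 * hist_returns, portfolio_value)

print(f"1-day 99% historical VaR:    {base_var:,.0f}")
print(f"Same measure, 2x vol stress: {stressed_var:,.0f}")
```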

Develop a framework with the ability to look into the details and underlying structures of these new complex financial instruments. Interactions with business process networks, rating agencies, risk bureaus and valuation services are all relatively cost-effective ways of seeking a better understanding of the underlying structures. Periodic updates of the underlying assumptions behind customised risk models are essential in safeguarding against risk. Prioritise the establishment of an infrastructure to obtain timely information and take corrective actions.

Improve transparency on liquidity position

Focus on modeling dynamic liquidity using scenario-based analytical tools to forecast extreme possibilities (a liquidity crunch in the system) well in advance, and so safeguard against the huge cost of last-minute fire fighting. Focus on modeling the behavior of all the major balance sheet constituents (i.e. assets and liabilities), rather than on traditional methodologies that consider only the volatile component of demand and time liabilities, in order to optimise the ‘cost of liquidity’.
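
A much-reduced illustration of scenario-based liquidity analysis is sketched below: project cash inflows and outflows per time bucket under a base case and a stressed case (inflow haircuts for asset liquidity risk, run-off add-ons for funding liquidity risk) and examine the cumulative gap. The buckets, figures and scenario factors are assumptions for illustration only.

```python
# Projected cash flows per time bucket, in millions (illustrative figures)
buckets  = ["0-7d", "8-30d", "31-90d"]
inflows  = [120.0, 260.0, 400.0]   # maturing assets, expected receipts
outflows = [100.0, 240.0, 380.0]   # maturing liabilities, expected payments

def cumulative_gap(inflows, outflows, inflow_haircut=0.0, outflow_runoff=0.0):
    """Cumulative liquidity gap after applying scenario adjustments: the
    haircut reduces inflows (asset liquidity risk), while the run-off
    add-on increases outflows (funding liquidity risk)."""
    gap, cumulative = 0.0, []
    for cash_in, cash_out in zip(inflows, outflows):
        gap += cash_in * (1 - inflow_haircut) - cash_out * (1 + outflow_runoff)
        cumulative.append(gap)
    return cumulative

base   = cumulative_gap(inflows, outflows)
stress = cumulative_gap(inflows, outflows, inflow_haircut=0.25, outflow_runoff=0.15)

for bucket, b, s in zip(buckets, base, stress):
    print(f"{bucket:>6}: base gap {b:7.1f}m, stressed gap {s:7.1f}m")
```

In a real framework the haircut and run-off factors would themselves be scenario outputs, calibrated to observed behavior rather than fixed constants.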

Give equal emphasis to qualitative parameters in risk assessment

Build an early warning framework on a configurable rules engine in order to respond swiftly to the changing magnitude and frequency of credit events. The framework should incorporate a range of qualitative parameters and judgmental inputs, and should include configurable expression and trigger builders with configurable threshold definitions.
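
A configurable rules engine of this kind can be as simple as a table of named indicators, conditions and severities evaluated against an incoming metric snapshot. The sketch below illustrates the idea; the indicator names and thresholds are hypothetical and would, in practice, live in configuration rather than code.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Trigger:
    name: str
    condition: Callable[[Dict[str, float]], bool]  # configurable expression
    severity: str

# Hypothetical early-warning triggers; thresholds would sit in configuration
rules: List[Trigger] = [
    Trigger("30+ day delinquency rate above 4%",
            lambda m: m["delinquency_30d"] > 0.04, "amber"),
    Trigger("Utilisation of approved limits above 90%",
            lambda m: m["limit_utilisation"] > 0.90, "amber"),
    Trigger("Collateral coverage below 100%",
            lambda m: m["collateral_coverage"] < 1.00, "red"),
]

def evaluate(metrics: Dict[str, float]) -> List[str]:
    """Return the alerts fired by the current metric snapshot."""
    return [f"[{r.severity}] {r.name}" for r in rules if r.condition(metrics)]

snapshot = {"delinquency_30d": 0.055,
            "limit_utilisation": 0.82,
            "collateral_coverage": 0.97}
for alert in evaluate(snapshot):
    print(alert)
```

Because the conditions and thresholds sit in data rather than code, new triggers can be added or retuned as the magnitude and frequency of credit events change.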

Build a robust data integration layer for risk assessment and decision making

Any risk management application implementation in a medium-to-large bank is a complex amalgamation of many different projects and initiatives across the organisation. One of the largest components is the aggregation of finance and risk information from the bank’s various lines of business across regions to quantify risk numbers at the organisational level. Data capture processes and integrity checks are required to ensure the availability, quality, standardisation and integrity of the data going into risk assessment and decision making. Building the data integration layer will include the following steps (a minimal sketch follows the list):

  • Locate the relevant data across the various source systems, based on the data requirements defined in the logical data model, even where the source systems use different local definitions.
  • Devise a method to source the data from these disparate source systems.
  • Integrate the sourced data, which varies in structure and standard, into one universal data repository or data warehouse (DWH) with a single business definition, by extracting it into a staging area and then transforming it.
  • Generate clean and correct data, maintained to the desired quality and reliability standards, to be fed to the risk calculation engine for reporting and analysis.
  • Facilitate ease of storage and accurate retrieval of data to enable reliable reporting and analysis.
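
The sketch below illustrates, under assumed source-system formats and field names, how two extracts with different local definitions might be staged and transformed into a single canonical view ready for the risk calculation engine.

```python
import pandas as pd

# Hypothetical extracts from two source systems with different local formats
retail_loans = pd.DataFrame({
    "acct": ["R-001", "R-002"],
    "bal_usd": [250_000, 90_000],
    "status": ["CURRENT", "DELINQ"],
})
corporate_loans = pd.DataFrame({
    "facility_id": ["C-77", "C-78"],
    "outstanding": [4_500_000, 1_200_000],
    "perf_flag": ["P", "NP"],
})

def to_canonical(df: pd.DataFrame, mapping: dict, status_map: dict) -> pd.DataFrame:
    """Staging-area transformation: rename fields to a single business
    definition and standardise coded values before loading into the DWH."""
    out = df.rename(columns=mapping)[["exposure_id", "exposure_amount", "performing"]].copy()
    out["performing"] = out["performing"].map(status_map)
    return out

staged = pd.concat([
    to_canonical(retail_loans,
                 {"acct": "exposure_id", "bal_usd": "exposure_amount", "status": "performing"},
                 {"CURRENT": True, "DELINQ": False}),
    to_canonical(corporate_loans,
                 {"facility_id": "exposure_id", "outstanding": "exposure_amount", "perf_flag": "performing"},
                 {"P": True, "NP": False}),
], ignore_index=True)

print(staged)  # single, clean view ready for the risk calculation engine
```

Real integration layers obviously involve far more reconciliation and data-quality checking, but the staging-and-standardisation pattern is the same.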
Convert silo model prototypes into applications

Due to the inherent limitations of silo model prototypes described above, they cannot be used as applications serving end users. Hence, to move these models onto a risk management platform, organisations should either use their captive information technology unit to re-engineer the model into an application, or outsource this activity to firms providing IT-enabled services that specialise in the risk domain.

Apart from supporting compliance with regulatory requirements, the end application provides key business benefits, such as adequate data segmentation, better availability, secure connectivity across different locations, significantly enhanced performance standards, better application administration and stronger data security.

Conclusion

The best practices outlined in this article will lead to a dramatic change in the approach to risk assessment and ultimately result in a proactive approach to risk mitigation. They constitute a judicious blend of business process improvement and a technology-enabled framework, and will help companies to build a resilient and sustainable risk-informed decision-making framework that enhances competitive advantage.
