
An Evening with gtnews: Cascading Risk

The cascading of risk was the subject on the agenda for ‘An Evening with gtnews’, in association with Infosys, where members of the banking community gathered to hear risk experts discuss the best methods for seeing and controlling risks within treasury staff, processes and systems – as well as the extent to which this is possible.

Management vs Measurement

Gerald Ashley, managing director of financial consultancy St Mawgan and Co, made a distinction between the management and the measurement of risk, arguing that many people who say they are in risk management are actually in risk measurement. “Boy, do we live in a world that is obsessed with risk measurement – and there is not much management going on,” Ashley said. “One could make the observation that certainly since the mid-1990s never have there been so many people with ‘risk management’ on their business cards in finance and, boy, never have the losses been as large as they are.” He promised to examine whether this was a correlation or a coincidence, or whether the line between the two was in fact more blurred.

Ashley began by considering the current risk landscape and the particular difficulties it posed for treasurers trying to balance credit, profitability and liquidity – a balance that had been critical over the past 18 months. He pointed to crises from past decades, where the drying up of liquidity caused markets to seize up. These included the banking bust of 1975/6, when a large number of undercapitalised banks went under, causing turbulence until the early 1980s. He also invoked the 1998 Russian bond crisis, a “dry run of the credit crisis.”

The latter, he said, should have warned observers during the recent credit crisis of a major risk – unexpected correlations in the market, where nearly all the markets fell simultaneously. “…we had started to be in a model-driven world where, if all the models are saying ‘sell’, who the hell is buying? If you are long of an asset that has no value, you have to sell something else to get back inside your value-at-risk (VaR) limits,” Ashley added.

The increasing complexity of risk compared with 20 years ago was another major hurdle – “to the extent that one begins to wonder if it is possible to understand the risks one is undertaking.” Regulation was a further one: Ashley disagreed that “light-touch regulation” was a hallmark of the Financial Services Authority (FSA)’s current approach, and added that it was likely to become heavier in time.

Foreshadowing the theme of Milind Kolhatkar’s following presentation, Ashley outlined the main potential areas of risk: people, processes, systems and external systems. Staff, he said, were a larger factor than generally perceived: “It’s not just about computer systems and processes.”

The amount of information available to those working in the markets – and the reliability of that information – was represented by Ashley as a sliding scale from certainty, through risk, to uncertainty. He contended that the recent move to turn risks into ‘uncertainties’ was a major cause of the current crisis: instead of gathering facts to gauge the level of risk accurately, risks were extrapolated until the level of information available made the risk models unreliable and, often, meaningless. “The credit crunch, I would contend, turned risks into uncertainties,” he said. “Some of the products that were created, in terms of the credit markets, [meant] we actually divorced ourselves as between borrowers and lenders. Things were cut and sliced and diced so many times that although it looked as if we had a portfolio effect, we had created all sorts of uncertainties in these instruments.”

Inadequate Data

Examples of these uncertainties included default and recovery rates – for which, it became apparent, banks didn’t have adequate data to predict with any degree of certainty. Liquidity, another hallmark of the crisis, was very difficult to model accurately based on past information. Valuation gaps and opaque market data were also perceived as risks rather than uncertainties. “It is one thing to have a reference rate; it is another thing to find you own five or ten per cent of an issue once you trade in it,” Ashley added.

Ashley turned to the tools currently used to measure risk – and their relative levels of adoption. Standard risk tools, such as VaR, he said, were almost universal, while stress-testing was becoming widely adopted. Scenario testing, in turn, had been much more widely adopted in industries other than financial services; finance could learn from sectors such as the continuous process industries, security and defence, where such techniques had already become commonplace.
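To make the distinction between these tools concrete, the sketch below shows a historical-simulation VaR calculation alongside a single deterministic stress test. It is illustrative only and not drawn from the presentation; the portfolio value, return series and shock size are hypothetical assumptions.

```python
# Minimal sketch: historical-simulation VaR and a simple stress test.
# All inputs are hypothetical, purely for illustration.

import numpy as np

rng = np.random.default_rng(42)

# Assume 500 days of daily portfolio returns (simulated here for illustration).
daily_returns = rng.normal(loc=0.0, scale=0.01, size=500)

portfolio_value = 10_000_000  # hypothetical portfolio value

def historical_var(returns, value, confidence=0.99):
    """One-day VaR: the loss not exceeded on `confidence` of historical days."""
    loss_quantile = np.quantile(returns, 1 - confidence)  # e.g. the worst 1% return
    return -loss_quantile * value

def stress_test(value, shock):
    """Loss under a single deterministic shock, e.g. a 10% market fall."""
    return value * shock

print(f"99% one-day VaR: {historical_var(daily_returns, portfolio_value):,.0f}")
print(f"Loss in a 10% market-fall scenario: {stress_test(portfolio_value, 0.10):,.0f}")
```

The contrast Ashley draws is visible in the sketch: the VaR figure is only as good as the historical sample behind it, while the stress test asks what happens under a shock chosen by judgement rather than by the data.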

He also highlighted a less commonly used risk tool: real options, which apply options-pricing techniques to non-financial assets – with the warning that finding the correct underlying price is difficult. Ashley also examined morphological analysis, a technique that develops an input and output model for all the factors that impact on the issue at hand and then develops a structure to illustrate all the real and potential links between inputs and outputs. He characterised it as “a more auditable form of scenario planning, which removes cross-dependencies – and is used widely in high-tech industries, although its efficacy is yet to be proved in financial services.”
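As a purely illustrative sketch of how such a morphological field might be explored in code (the factors, states and consistency rules below are invented for this example, not taken from the talk), the combinations of factor states can be enumerated and then pruned with pairwise consistency checks:

```python
# Illustrative sketch of the cross-consistency step in morphological analysis:
# enumerate every combination of factor states, then keep only those that are
# not judged internally contradictory. Factors and rules are invented examples.

from itertools import product

factors = {
    "funding_liquidity": ["ample", "tight", "frozen"],
    "counterparty_defaults": ["low", "elevated", "clustered"],
    "market_volatility": ["normal", "high"],
}

def is_consistent(scenario):
    """Illustrative pairwise rules that rule out contradictory combinations."""
    if scenario["funding_liquidity"] == "ample" and scenario["counterparty_defaults"] == "clustered":
        return False  # clustered defaults alongside ample funding deemed implausible
    if scenario["funding_liquidity"] == "frozen" and scenario["market_volatility"] == "normal":
        return False  # frozen funding rarely coincides with normal volatility
    return True

names = list(factors)
all_combinations = (dict(zip(names, values)) for values in product(*factors.values()))
scenarios = [s for s in all_combinations if is_consistent(s)]

for s in scenarios:
    print(s)
print(f"{len(scenarios)} consistent scenarios remain out of 18 combinations")
```

The surviving combinations form the set of scenarios worth examining further, which is what makes the approach more auditable than free-form scenario brainstorming: every excluded combination is excluded by an explicit, recorded rule.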

Ashley concluded by highlighting the difficulties of scenario planning where there was not enough information to form a meaningful analysis. This often resulted in a ‘best-guess’ outcome with a degree of latitude on either side. This, however, was not sufficient. “With scenarios, of course, you are looking at different timings and a whole range of outcomes and trying to get away from this bell curve trap,” he said. “We are not trying to fit things into a model all the time. We need to have a more agile, looser way of thinking.”

Milind Kolhatkar, head of Finacle Treasury Solutions at Infosys, took up the theme of how treasurers could deal with the uncertainty outlined by Ashley. First, he said, uncertainty needed to be tracked, even where it could not be measured. Second, should risk techniques be used for day-to-day management or, more actively, to deal with sudden shocks to the business? Third, should systems and processes take charge of risk management, or was a common-sense approach more effective?

Kolhatkar represented the hierarchy of risk as a pyramid with the most certain risk factor – systems – at the bottom, followed by processes, people and with external events – the least predictable – at the top.

Companies, he pointed out, needed to be sure that their risk management systems could withstand business interruption. “If there is a problem like 9/11, is your system able to support the back-up of the data? Is it ready for business interruption?” he asked. “There could be occasions where there are interruptions for a few hours or a few minutes where you are not able to get, for example, data feeds from markets.” Models, too, needed to be proven and sophisticated enough to price risk accurately.

Processes, such as collateral and documentation follow-up, which in turn posed operational risk, could also be addressed by technology. Kolhatkar pointed to the importance of creating predefined processes reflected in the software and the technology used. People, he said, were less predictable, and human error posed the greatest reputation risk.

External Crises

But it was external events that posed the greatest threat to companies – and they were becoming more common.

“These kinds of crises seem to happen more frequently than we would have thought, so something that used to happen every 50 years is now happening every three years,” he said. “These kinds of big market shocks are here to stay.” He added that although risk management measures, in the form of software, systems and processes, could take care of measurable factors such as a company’s P&L and VaR – things we are used to measuring – these measures had proven insufficient for dealing with systemic breakdowns when they happened. “[Specific measures] may not be able to measure exactly what is going to happen, but it gives us some level to which we will get back,” he said. “If, today, we just look at the existing measures, without preparing for extreme measures, then it will just keep on repeating itself.”

Kolhatkar said that total visibility was crucial, even if risks could only be measured imperfectly. One risk – for example, where the same person had control over both front- and back-office processes – could prove fatal to an enterprise, even where all other risks were being monitored. “All the risks should be on the horizon – the risk management person should be able to see as many risks as possible. The visibility has to be there for all the risks. You may not be able to measure them perfectly but visibility is very important because that one single risk, if ignored, can kill you,” he warned.

Taking Responsibility for Risk

The question and answer session proved a valuable opportunity for the audience to participate. The first question, posed by gtnews’ chief executive Mike Hewitt, concerned where the responsibility for risk lay: with specialised risk professionals, or with treasury professionals within either banks or corporates? Ashley responded that the sector itself was not the most crucial thing – the need was for the audience interpreting the risk models to understand the models’ limitations. “I think the problem may not be so much in your terms but in terms of whether the creators of risk outputs and the audiences who then make management use of them misunderstand one another.”

And what should banks take away from the evening’s presentation, in terms of specific steps that would safeguard their companies? Ashley responded that, while he expected scenario planning to become much more mainstream over the next few years, due to the increasingly complex marketplace, there was a danger that, in trying to formalise the structures, it would result in “taking an agile tool and making it more rigid”.

Another question addressed the tension between creating policies to measure risk and approaches to take care of it directly – to which Ashley replied that risk modelling could not act as a panacea for risk. However, he added that there was no perfect solution: “I am criticising modelling maybe having gone too far and I am now contradicting myself by saying that the future probably looks towards yet more complex solutions. It is not an easy way forward.”

And finally, should there be external certification of risk management processes? Kolhatkar thought that this would smack of micro-management on the part of the regulators. “An individual institution decides how it wants to run its business, whether it wants to take high risk and put more capital at risk or it wants to play the low capital/low risk business.”

So, perfect solutions are still a long way off for determining the best way of identifying, predicting and minimising risk. However, there are practical steps that treasurers in banks and corporates can take today that will help them to gain visibility over the risk management process. By establishing who has responsibility for risk management, applying the right technology and carrying out a variety of sophisticated risk measurement and management strategies, organisations can ensure that they are prepared for future events.

Gallery from ‘An Evening with gtnews’, 20 May 2010
