Risk: The New Concerns
Insufficient data consistency and the inability to aggregate data across risk types are affecting financial institutions’ ability to manage risk, according to a risk priority seminar held by think-tank JWG, which surveyed 30,000 financial professionals. Funds transfer pricing (FTP), stress testing and the need for detailed capital and liquidity ratios were hotly debated under the Chatham House Rule at the event, which was held in conjunction with the Financial Services Authority (FSA) and whose panellists included bank trading and treasury representatives.
Other areas of current concern include aligning business and regulatory objectives, finding the best way to incentivise good risk management, and embedding that incentive in corporate culture.
The discussion showed that banking leaders have started to tackle the big issues, although securing the resources to put these measures in place for 2011 is proving difficult. The effort is partly driven by the fear of greater penalties for inadequate risk infrastructures emerging over the next few years. Board involvement in risk management was a key theme: panellists reported that it was playing a crucial part in pushing through changes, with risk reporting to the board increasing in frequency, in some cases to daily.
In terms of creating a risk strategy, a ‘90% effective’ risk strategy was no longer regarded as anywhere near sufficient, but panellists emphasised that the goalposts had moved to such an extent that choosing a risk strategy was fraught with difficulty. The point was made that there were no ‘quick wins’ in reassessing strategy: whole company structures and business processes would have to be re-evaluated.
The existence of silos across business functions was also of concern: it had become clear that business functions within the same organisation were identifying similar weaknesses in their risk management, but that a failure to share data was restricting progress in tackling them.
Being able to locate the relevant information on the balance sheet easily was as important as gathering the data in the first place. One speaker characterised the answer as a ‘top-down view’ with a ‘bottom-up solution’, explaining that the drive for more information was, in his experience, impeding clarity rather than improving it: granularity was being lost, and information on cash flow, for example, was being buried. The right level of granularity is reached when a firm holds the minimum amount of information needed to answer the key questions.
However, one panellist explained that transaction banking involves more complex information than retail banking, so the information flow is considerably slower. Adding extra avenues for data flow was the only way around this, but it was a cumbersome means of addressing problems with legacy systems. This lack of straight-through processing (STP) was seen as the key obstacle to streamlining the reporting cycle and, in turn, to delivering the most accurate information to the board.