Overcoming Data Complexity

Increasing volumes of information, combined with the ever-present blur between what is deliverable and what is still in development, continue to cause head scratching among enterprise data managers. The question they focus on is how best to ensure that their data quality not only generates confidence among users but also makes it easier to develop strategies that will net their companies significant profits while others are left behind. Within a framework of tighter regulation, this is a tough challenge.

Data Quality

Asset managers rely on accurate and accessible reference data in order to execute their trades, value their portfolios, manage risk and comply with a burgeoning number of regulatory demands. The formation of industry groups to pool experience on enterprise data management marks the maturity of the concept, even if some are still struggling with implementation.

More than this, certain high fliers within the securities industry not only recognise the intrinsic value proposition that data offers, but are actively seeking more sophisticated ways of turning data analysis to their advantage. To do this, they must know that the data they are presented with is of the highest quality: clean, consistent and current. That confidence enables fund managers and traders throughout the enterprise to make more refined, and larger, decisions than ever before, and encourages an environment where high levels of data curiosity, and the attendant profits from the 'golden nuggets' thus revealed, are the norm. All of this has to be achieved within a climate of cost control.

Climate of Cost Control

A study earlier this year by the US research and advisory firm Aite Group showed that, since 2001, the financial services industry has generated profit primarily through cost cutting, and questioned whether this can continue without serious damage to the basic operational infrastructure. At the same time, data outsourcing opportunities are presenting themselves. Managed data services and technology, for example, can consolidate, validate and enhance published reference data, and such services may extend to data management, information technology, vendor management and performance measurement.

In the financial services industry as a whole, IT spending remains conservative, and only projects that combine creative thinking with solid ROIs are being funded. In this environment the perceived strategic value of the outsourcing option, combined with the ability to retain proprietary data in house, can be compelling. Hybrid data management solutions are therefore becoming increasingly attractive to many of the world's asset managers: according to an A-Team study from May 2005, while firms may not be enthusiastic about being pioneers, they are open to the potential benefits.

Total Data Management Approach

The benefits of a total data management approach are almost self-evident. Centralised and standardised data supports risk management, STP and front-to-back-office data integration. Additionally, data management effort can be allocated more efficiently and cost-effectively, with commoditised data handled by the outsourcing supplier and strategic data remaining within the firm. Potential economies of scale are another bonus.

The debate over outsourcing, particularly in key aspects of the business, will continue. However, third party IT management in particular has reached such a level of maturity that the issues are all known and the options well explored. While the relationship is fundamentally one of trust, contracts are now robust, there are many reference points on outsourcing and participants can quickly become well versed in their roles and responsibilities.

Reference data is estimated to comprise some 40 per cent of the information involved in every trade, and includes the identity, origin, ownership, pricing history and legal terms and conditions of all securities. The emphasis to date has been on the collection, validation, normalisation and consolidation of this data from multiple feeds into cleansed composite 'golden copy' sets for business use. The availability of the outsourcing option on a global basis is an important step forward, allowing data managers to concentrate on supporting high-value activities while managing risk and making the best use of available resources.
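To make the 'golden copy' process concrete, the sketch below shows in Python how records from multiple feeds might be validated, matched on a common identifier and consolidated under a source-priority rule. The feed contents, field names, priority rule and validation checks are all hypothetical, illustrating the general technique rather than any vendor's actual service.

```python
# A minimal sketch of 'golden copy' consolidation, assuming two hypothetical
# vendor feeds describing the same securities. Field names, the source-priority
# rule and the validation checks are illustrative, not a vendor's actual API.
from dataclasses import dataclass

@dataclass
class SecurityRecord:
    isin: str          # identifier used to match records across feeds
    name: str
    currency: str
    price: float
    source: str

# Hypothetical feed extracts; in practice these arrive as vendor files or feeds.
feed_a = [SecurityRecord("DE0008490962", "Union Inv Fund", "EUR", 101.25, "vendor_a")]
feed_b = [SecurityRecord("DE0008490962", "Union Investment Fund", "EUR", 101.30, "vendor_b")]

SOURCE_PRIORITY = {"vendor_a": 1, "vendor_b": 2}  # lower number wins conflicts

def validate(rec: SecurityRecord) -> bool:
    """Basic cleansing rules: well-formed ISIN, known currency, sane price."""
    return len(rec.isin) == 12 and rec.currency in {"EUR", "USD", "GBP"} and rec.price > 0

def build_golden_copy(*feeds):
    """Consolidate validated records from all feeds into one record per ISIN."""
    golden = {}
    for rec in (r for feed in feeds for r in feed if validate(r)):
        current = golden.get(rec.isin)
        # On a conflict, keep the record from the higher-priority source.
        if current is None or SOURCE_PRIORITY[rec.source] < SOURCE_PRIORITY[current.source]:
            golden[rec.isin] = rec
    return golden

if __name__ == "__main__":
    for isin, rec in build_golden_copy(feed_a, feed_b).items():
        print(isin, rec.name, rec.price, "from", rec.source)
```

In practice the consolidation rules are far richer, with field-level precedence, exception queues and audit trails, but the principle is the same: one validated record per instrument, distributed from a single point.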

Reference Data – Progress on Standardisation

There is probably no industry to parallel IT for the dichotomy between providers and users, and reference data is a shining example. On the one hand, data providers have a built-in commercial interest in preserving their differentiators. On the other, their users, hit with increasing volumes of data and compliance requirements, would at least find their pressures decreasing if any level of standardisation were achieved. That pressure continues to build.

Juergen Stahl, Head of Business Support Investment at Germany's Union Investment Group, points out that, rather than imposing restrictions on the investment vehicles used in 'classical' funds to ensure the quality of the product, regulators now impose requirements on the quality of the investment manager. Added to this, the opening up of the universe of derivative instruments and structured products has put portfolio managers under much greater pressure for flexibility and speed in introducing new instruments and products into the investment process. At the same time, the requirements on the investment manager itself, particularly in the area of risk management, increase the demand for both breadth and quality of available reference data.

Asset Control concurs that there is an increasing need for operational excellence, flexibility and speed. Asset managers have seen their margins tighten and their inflows decrease over the past four years, yet compliance requirements have mushroomed. The trading environment remains volatile, hence the need for scalability and flexibility in any supporting systems to enable cost control alongside service. The middle-office data hub is a valid architecture for this.
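As a rough illustration of the middle-office data hub idea, the sketch below implements a minimal in-process publish/subscribe hub in Python, with the front office and risk function consuming the same golden-copy update from a single point. The topic name, record shape and consumers are hypothetical; a production hub would sit on messaging middleware rather than in-process callbacks.

```python
# A minimal sketch of a middle-office data hub: one validated feed in,
# many downstream consumers out. Names and record shapes are hypothetical.
from collections import defaultdict
from typing import Callable, Dict, List

class DataHub:
    """Central distribution point for cleansed reference data."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a downstream consumer for a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, record: dict) -> None:
        """Deliver one record to every consumer of the topic."""
        for handler in self._subscribers[topic]:
            handler(record)

hub = DataHub()
# Front office and risk each receive the same golden-copy update once,
# instead of sourcing and cleansing the data independently.
hub.subscribe("reference_data", lambda r: print("front office priced:", r["isin"]))
hub.subscribe("reference_data", lambda r: print("risk revalued:", r["isin"]))
hub.publish("reference_data", {"isin": "DE0008490962", "price": 101.25})
```

The design point is that consistency comes from distribution, not duplication: every desk sees the same record at the same time, which is what makes front-to-back integration and centralised cost control feasible.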

Work on the middle path of standardisation, delivery mechanisms for example, goes on, but many argue that, as far as the bigger picture is concerned, a pan-industry initiative can only succeed when a CREST or a MiFID comes along with a strong element of benevolent despotism. Other senior commentators believe that the committees currently discussing the way forward on both sides of the Atlantic have an opportunity that must be taken.

With compliance driving the need for cleansed data, and with standardisation still far away, there is no shortage of work for data management vendors. Investing alongside their users, they still represent the most practical way forward for an industry in constant transition.
