THE CURSE OF MARKET DATA INFLATION

Data is simply amazing: the more it is used, the more it creates, and then even more is used. Unfortunately, with market data that comes at a cost. We estimate market data consumers overspend by between 25% and 35% on average, so if your bill is $10 million per annum, that is roughly $3 million of potential wastage each year.
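As a back-of-the-envelope illustration of that estimate (the bill size and the 25-35% range are simply the figures quoted above, not a model), the potential wastage works out as follows:

# Rough illustration of the overspend estimate quoted above (Python).
# The $10 million bill and the 25%-35% range are taken from the text.
annual_spend = 10_000_000                       # annual market data bill, USD
overspend_low, overspend_high = 0.25, 0.35      # estimated overspend range

wastage_low = annual_spend * overspend_low      # $2,500,000
wastage_high = annual_spend * overspend_high    # $3,500,000
wastage_mid = annual_spend * (overspend_low + overspend_high) / 2   # $3,000,000

print(f"Potential wastage: ${wastage_low:,.0f} to ${wastage_high:,.0f} "
      f"(midpoint ${wastage_mid:,.0f}) per year")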

Banks and other financial institutions do not pay for data because they want to; they pay because they want to use market data to make money for themselves. It is a cost of doing business, and when the costs become too high it is time to cut. Right now financial institutions are under pressure, and thanks to the curse of market data inflation, market data spend is the prime target for the concerned CFO.

There are three core reasons why market data costs have increased over the last 15 years:

1. Growth in overall usage, especially enterprise-wide and for automated processes

2. Data sources (especially index creators and credit rating agencies), exchanges, vendors, and other suppliers seeking and finding new ways to charge for usage

3. Regulators introducing new rules that inevitably increase data usage to meet higher standards, especially for reporting.

And this is all before the emergence of ESG data and Alt data.

Below we take a look at how we strategise with our clients.

5 STRATEGIES FOR SUCCESS

As we have already pointed out, reducing market data spend is becoming harder to achieve than before, due to contractual obligations, the difficulty of removing data integrated into networked systems, and smaller organisations lacking the internal experience, the resources, or both, to thoroughly optimise their market data costs.

Our approaches include:

1. While traditional cost-cutting strategies are the right place to start, they are exactly that: the start. Projects require even more emphasis on the business' relationship with its data usage. Cost management must be business-driven.

2. While clients will attempt to reduce their reliance on traditional suppliers, such as Bloomberg, Refinitiv, and the venues, they are becoming more receptive to challenger vendors and sources as they realise that networked systems, not people, take priority in the decision process. The Cloud makes an impact.

3. Analysing the balance between managing data in-house and third-party processing, which can include everything from valuations to algorithmic trading, while understanding that the appetite for such strategies will vary from low to high

4. Establishing the opportunities to better use internal/self-sourced data

5. Changes in strategic approaches to ‘Total Information Management’

We examine each in more detail below.

1. Building upon Traditional Cost-Cutting Strategies

In the old days, when a person left, the business could simply cut the terminal on the desk, but the growth of enterprise usage means the impact of this action in overall expenditure terms is reduced. Banks still need access to data to fulfil risk, valuation, and other functions; the fact that someone has left does not remove the data requirement. Even if a bank leaves a specific market, it takes time to eliminate the related obligations (Lehman's is still being wound down), especially as vendors and exchanges have added more licences to cover such usage. This has been compounded by both regulatory requirements, especially MiFID II, and contractual agreements, which often have multi-year terms.

Financial institutions that adopt only old-school cost-cutting approaches find their attempts fall short. The alternative is to step back and plan ahead using the various tools that now provide greater clarity on firm-wide data usage.
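As a minimal sketch of what that firm-wide clarity can look like in practice (the inventory records, services, and thresholds below are hypothetical, not drawn from any specific tool), even a simple aggregation of an entitlement inventory surfaces review candidates:

from collections import defaultdict
from datetime import date

# Hypothetical firm-wide entitlement inventory: one record per user/service pair.
# Real inventory tools carry far more detail; this only illustrates the idea.
entitlements = [
    {"user": "fx_desk_01", "service": "Terminal A", "annual_cost": 24_000, "last_used": date(2023, 6, 1)},
    {"user": "fx_desk_02", "service": "Terminal A", "annual_cost": 24_000, "last_used": date(2022, 11, 3)},
    {"user": "risk_batch", "service": "EOD Prices", "annual_cost": 60_000, "last_used": date(2023, 6, 19)},
]

STALE_AFTER_DAYS = 180
today = date(2023, 6, 20)

spend_by_service = defaultdict(float)
review_candidates = []
for e in entitlements:
    spend_by_service[e["service"]] += e["annual_cost"]
    if (today - e["last_used"]).days > STALE_AFTER_DAYS:
        review_candidates.append(e["user"])

for service, cost in sorted(spend_by_service.items(), key=lambda kv: -kv[1]):
    print(f"{service}: ${cost:,.0f} per year")
print("Unused for 180+ days:", review_candidates)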

Cost optimisation now becomes a strategic plan for the future.

2. Data Consumers Attempt to Reduce Their Reliance on Traditional Suppliers

There is still a lot of affinity towards data suppliers that are known and trusted on a personal basis, brands like Bloomberg, Refinitiv, and S&P Global. They are known quantities and qualities, but their services are rarely used to anything like their full capabilities, and that excess comes with a supplier's premium.

Internally, non-display usage (NDU) decreases the personal bias: a risk system 'cares' about the right price data, not who supplies it. The brand premium disappears, and the person on the front desk is happy when the outputs do their job. Externally, the Cloud now provides a similar experience: banks can select from a wider range of suppliers, often specialists, and squeeze the excess data out of the system. From both ends, assuming no compromise in data quality, cost of service reintroduces a competitive element into decisions that once favoured the automatic 'I won't get fired for buying IBM' mentality.

Validate existing suppliers, evaluate alternatives.

3. Moving Data Processing to External Specialists

Banks are primarily interested in outputs and results. Ever greater automation places far more priority on the models, analytics, and algorithms that drive the outputs than on the data processing itself. For years, fund and securities administrators have built successful businesses on managing, i.e. processing, portfolio valuations and related services because they can do this more efficiently.

The same is happening in the trading environment. For instance, while smart order algorithms, risk engines, and margin calculations can be developed in-house, a third party can often execute them better and cheaper through lower data costs and the specialisation that comes from handling multiple accounts instead of a single in-house client. This third-party processing and facilitation is especially attractive to asset managers, quant traders, and the broader buy side, as their data costs per person are significantly higher than those of the wholesale investment and commercial banks.

It is all about finding a balance; the challenge is getting it right.

4. Analyse the Potential of Internal/Self-Sourced Data

This has been an objective of many financial institutions for some time, with varying degrees of success. The problem has always been that the data and information generated from the different markets and asset classes traded keep ending up siloed in multiple locations, both networked and geographical, and often in different, incompatible formats. This creates a significant workload and a need for resources to pull all of this data into usable data structures that can be turned into practical assets. A secondary problem is that banks have focused on using proprietary data to meet regulatory requirements instead of exploring its full potential.
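A minimal sketch of the consolidation workload described above, assuming two hypothetical silos that hold comparable price data in incompatible formats (the field names and values are illustrative only):

from dataclasses import dataclass
from datetime import datetime

# Two hypothetical silos holding comparable information in incompatible shapes.
equities_silo = [{"ric": "ABC.N", "px": "101.25", "ts": "2023-06-19T15:30:00"}]
rates_silo = [{"instrument_id": "XS0000000000", "price": 99.87, "timestamp": 1687188600}]

@dataclass
class NormalisedRecord:
    """A common internal schema that siloed records are mapped into."""
    instrument: str
    price: float
    observed_at: datetime

def from_equities(row: dict) -> NormalisedRecord:
    return NormalisedRecord(row["ric"], float(row["px"]), datetime.fromisoformat(row["ts"]))

def from_rates(row: dict) -> NormalisedRecord:
    return NormalisedRecord(row["instrument_id"], float(row["price"]),
                            datetime.fromtimestamp(row["timestamp"]))

consolidated = [from_equities(r) for r in equities_silo] + [from_rates(r) for r in rates_silo]
for rec in consolidated:
    print(rec.instrument, rec.price, rec.observed_at.isoformat())

The real work, of course, lies in agreeing that common schema and maintaining the mappings across every silo, which is why the resource cost is significant.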

There is a realisation that this data can have commercial applications, and some banks, like JP Morgan and UBS, have market data sales subsidiaries offering niche market pricing services. However, the greatest immediate value comes from its use in business and trading performance, especially when evaluating the efficacy of models and analytics. Clients are now taking this more seriously, starting to discover and then access what internal data they possess, and then working out how it can be leveraged to benefit the business.

Better utilisation of internal data adds value and reduces cost.

5. ‘Total Information Management’

Market data has always been at the vanguard of data ownership, intellectual property rights, and licences for usage. The growth in commercial data usage, alternative data, and ESG has expanded the data universe, and these new providers are learning their business models, both positively and negatively, from the market data world. In a world where similar ownership models are being employed across any medium transmitting information into the business, whether a financial institution or not, the management of that information becomes more of an issue.

We look at the management and control of information across the entire information spectrum: data, news, and related services. How market data is managed, the tools required (there are some first-rate ones out there), and knowing and following the workflow process from delivery to decision making are all critical.

Control. Manage. Report.

Understanding market data workflows leads to more efficient market data management and cost control.

STRATEGIES FOR SUCCESS

Finding the balance between the utility of the data subscribed to and its cost is a challenge because the trade-off is not always transparent: for example, an expensive item might only be used twice a year, but without it a high-value client will walk.

One element that always gets forgotten is time: reducing costs is rarely immediate, and a dealer can be long gone while the desktop terminal remains a cost because a contract remains in place. Time horizons are needed to plan into the future.
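A minimal sketch of putting utility, cost, and time side by side (every subscription, cost, and date below is hypothetical): a low-usage item can still be indispensable, and no saving arrives before the contract can actually be exited.

from datetime import date

# Hypothetical subscriptions: cost, observed usage, a business-criticality flag,
# and the earliest date the contract allows an exit.
subscriptions = [
    {"name": "Niche index feed", "annual_cost": 80_000, "uses_per_year": 2,
     "business_critical": True, "contract_end": date(2025, 3, 31)},
    {"name": "Desk terminal (leaver)", "annual_cost": 24_000, "uses_per_year": 0,
     "business_critical": False, "contract_end": date(2024, 1, 31)},
]

today = date(2023, 6, 20)
for s in subscriptions:
    cost_per_use = s["annual_cost"] / s["uses_per_year"] if s["uses_per_year"] else float("inf")
    months_locked_in = max(0, (s["contract_end"].year - today.year) * 12
                           + s["contract_end"].month - today.month)
    action = "keep (business critical)" if s["business_critical"] else "plan exit at contract end"
    print(f"{s['name']}: cost per use ${cost_per_use:,.0f}, "
          f"earliest saving in about {months_locked_in} months -> {action}")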

The market data cost control process is an opportunity to rebalance in favour of future data requirements, not simply an excuse for knee-jerk cuts, because everything is cyclical. Smart preparation now will provide the right data tools for when the market recovers.

SUMMARY: OPPORTUNITIES

Obtaining 26% (our average) annualised market data cost savings is built upon five pillars:

1. Cost optimisation as a strategic plan for future usage, reinforcing cost savings on a long-term basis

2. Validate existing suppliers, evaluate alternatives

3. Finding balances; the challenge is getting it right

4. Better utilisation of internal data adds value and reduces cost

5. Understanding market data workflows to gain more efficient market data management and cost control

A well planned approach to market data cost optimisation will pay long term dividends.

 

We have consulted, and continue to consult, on this important subject. Contact us today to arrange unbiased and independent advice.

Keiren Harris 20 June 2023

www.marketdata.guru

Please contact info@marketdata.guru for a pdf copy of the article

For information on our consulting services please email knharris@marketdata.guru

#marketdata  #marketdatacosts #financialinformation #referencedata #esgdata #altdata #costcutting