First up on Market Data Guru’s 5 predictions for the world of market data in 2023 (https://lnkd.in/ghfVeVFn) is the question of data costs returning as an urgent and immediate issue. As we head towards uncertain times, financial institutions are cutting costs, and people. While controlling market data costs has always been a theme, if not a strategy, this year it is an imperative, and the first prediction on our list.

It is fun listening to market data managers at big institutions explaining how effectively they have costs under control. Few really do, and we estimate that data consumers overspend by between 25% and 35% on average. Equally, as demand increases, costs axiomatically do too, especially when driven by regulatory fiat and other factors outside a business’s control. Nevertheless, weakening profits will drive the bean counters to cut costs, and data, being one of the largest, is an obvious target.

However, this will prove harder to achieve than before, due to contractual obligations, the difficulty of removing data integrated into networked systems, and smaller organisations lacking the internal experience to thoroughly optimise their market data costs.

Here are our 5 thoughts on cost cutting strategies:

1 Traditional cost cutting strategies are not going to work

2 Data consumers will attempt to reduce their reliance on traditional suppliers, like Bloomberg, Refinitiv, and the venues, and be more receptive to challenger vendors and sources, as they realise networked systems, not people, take priority in the decision process. The Cloud makes an impact

3 A switch in thinking from trying to manage all data in-house to third party processing, everything from valuations to algorithmic trading; new forms of outsourcing make this possible

4 Greater emphasis on internal/self-sourcing of data

5 Change in strategic approach to ‘Total Information Management’


1 Traditional cost cutting strategies are not going to work

In the old days, when a person left, the business could just cut the terminal on the desk, but the growth of enterprise usage means the impact of this action on overall expenditure is reduced. Banks still need access to data to fulfil risk, valuations, and other functions. The fact someone has left does not remove the data requirement. Even if a bank leaves a specific market, it takes time to eliminate related obligations (Lehman’s is still going), especially as vendors and exchanges have added more licences to cover such usage.

This has been compounded by both regulatory requirements, especially MiFID2, and contractual agreements which often have multi-year terms. Financial institutions adopting old school cost cutting approaches will quickly find their efforts limited. The alternative is to step back and plan ahead using the various tools which now provide greater clarity on firmwide data usage.

2 Data Consumers will attempt to reduce their reliance on traditional suppliers

There is still a lot of affinity towards data suppliers that are known and trusted on a personal basis, brands like Bloomberg, Refinitiv, and S&P Global. They are known quantities and qualities, but rarely are their services used to anything like their full capabilities, and that excess comes with a supplier’s premium.

Internally, non-display usage (NDU) decreases the personal bias: a risk system ‘cares’ about the right price data, not who supplies it. The brand premium disappears, and the person on the front desk is happy when the outputs do their job.

Externally, the Cloud now provides a similar experience: banks can select from a wider range of suppliers, often specialists, and squeeze the excess data out of the system. From both ends, assuming no compromise in data quality, cost of service introduces a renewed competitive element into a market that once favoured the automatic ‘I won’t get fired for buying IBM’ mentality.

3 Moving Data Processing to External Specialists

Banks are primarily interested in outputs and results. Ever greater automation places far more priority on the models, analytics, and algorithms that drive the outputs than on the data processing itself. For years, Funds & Securities Administrators have built successful businesses on managing, i.e. processing, portfolio valuations and related services because they can do this more efficiently.

The same is happening in the trading environment: while smart order algos, risk engines, and margin calculations can be developed in-house, a third party can often execute them better and cheaper through lower data costs and through specialisation, handling multiple accounts instead of just one in-house client.

This third party processing and facilitation is especially attractive to asset managers, quant traders, and the wider buy side, as their data costs per person are significantly higher than those of the wholesale investment and commercial banks.

4 Greater Use of Internal/Self Sourced Data
This has been an objective of many financial institutions for some time, with varying degrees of success. The problem has always been that the data and information generated from the different markets and asset classes traded keep ending up siloed in multiple locations, both networked and geographical, often in different, incompatible formats. This creates significant workload and a need for resources to pull all this data into usable structures that can be turned into practical assets.

A secondary problem is that banks have focused on using proprietary data to meet regulatory requirements instead of exploring its full potential. There is a realisation that this data can have commercial application, and some banks, like JP Morgan and UBS, have market data sales subsidiaries offering niche market pricing services. However, the greatest immediate value comes from its use in business and trading performance, especially when evaluating the efficacy of models and analytics.

In 2023 financial institutions will start, more seriously, to discover and access what internal data they possess, and then work out how it can be leveraged to benefit the business.

5 Introducing ‘Total Information Management’

Market data has always been in the vanguard of data ownership, intellectual property rights, and licences for usage. The growth in commercial data usage, alternative data, and ESG has expanded the data universe, and these new providers are learning their business models, both positively and negatively, from the market data world.

In a world where similar ownership models are being employed across any form of medium transmitting information into the business, whether a financial institution or not, the management of that information becomes more of an issue.

This will encourage organisations to centralise the management and control of information through stronger business units dedicated solely to information, data, news, and related services: standalone information hubs, answerable not to procurement, not to IT, not to operations, that focus on one thing only, information, and are responsible to, and work with, the business.


What is notable is that all the predictions are ultimately cost driven, but they should be applied (and probably won’t be) strategically.

Finding the balance between the utility of the data subscribed to and its cost will be a challenge because the trade-off is not always transparent: for example, an expensive item might only be used twice a year, but without it a high value client will walk. The financial institution loses out.

When higher market data costs turn once profitable markets into marginal investment opportunities, the financial institution will walk. This time the vendors and exchanges lose out.

One element that always gets forgotten is time. Reducing costs is rarely immediate, and a dealer can be gone while the desktop terminal remains as a cost because the contract remains in place. Data consumers rarely take time horizons into account when planning for the future.

The market data cost control process ought to be used as an opportunity to re-balance in favour of future data requirements, not simply to indulge in knee-jerk cuts, because everything is cyclical. Smart preparation now will provide the right data tools for when the market recovers.

In 2023, how much data consumers are willing to pay will determine how the year progresses, both negatively and positively.

That is an intriguing prospect.

Keiren Harris 18 January 2023



Please contact info@marketdata.guru for a pdf copy of the article

For information on our consulting services please email knharris@datacompliancellc.com