My own concept of data and historical data is to "think of every data point as an event, each representing a single action of human activity; a series of historical data is then the sum of that activity".

If we assume every tick is a new event, then everything that comes before it is the past, so in historical data terms we are thinking of both delayed data and time series. To many exchanges the past is forgettable, and therefore so is the data's value. We shall see why this is in error.

Currently, data is priced and valued according to who is paying for it, and financial institutions have lots of cash. This has traditionally placed a premium on the data closest to the point at which a trade takes place, and on the pre- and post-trade processing that follows, such as risk management, margin, and portfolio valuations. Historical data becomes an afterthought.

Exchanges, data sources, and to a lesser extent vendors have predicated their pricing models and revenue streams on a single type of organisation, the financial institution, using data in the same way in the past, now, and in the future.

That model, and that thinking, is now obsolete for three reasons:

1. The smarter financial institutions now have far more sophisticated models and analytics driven by the sum of trading events, i.e. historical data, where the execution point of a trade, or series of trades, is predetermined by the model. The real time price only comes into the equation when the relevant signal is hit. The real time price is the trigger, not what pulls the trigger, and,

2. Where investors, for instance retail, are not so time sensitive, real time data becomes less of a necessity; prices need only be relatively close to the market, with real time snapshots coming into play for execution. Although the value of each such transaction may be smaller, the same level of processing is still required for every trade. As the retail investor is external to, say, a broker, the broker needs to get that price information to him or her, which means mass price distribution, usually via the internet along with historical charts. Often all free.

3. Growth of third party processors for services where real time data is not required. These organisations often exist out of the line of sight of data sources.


Delayed data is an afterthought in the data delivery chain, and yet its usage is ubiquitous and increasing; it is just that the people who own the data do not see it (usually because they do not look), or they cannot see it.

While delayed data is historic data, for commercial purposes we should consider it non-real time data delivered after 15 minutes (the standard delay, though some markets use 20 minutes). The key word is 'after', because data owners have missed the change in definition and in how the delay is applied.

  • Traditionally, delayed data was a price that updated only every 15 minutes, so firmly out of the market, and only of use for indicative and reference purposes. It was understandable that this data was provided free: the data consumers gained little benefit, and really needed access to real time data for market participation
  • Smart technology now takes tick by tick data and continuously streams price updates exactly 15 minutes after each real time update. This transformation dramatically increases the value of delayed data because it becomes more user and analytics friendly to a wider audience
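The second bullet's mechanics can be sketched as a simple delay buffer: each incoming tick is queued with its exchange timestamp and released only once the delay has elapsed, so the delayed feed preserves the full tick-by-tick sequence rather than sampling it. A minimal sketch, assuming a 15 minute delay and in-order tick arrival; names are illustrative, not any vendor's implementation.

```python
from collections import deque
from dataclasses import dataclass

DELAY_SECONDS = 15 * 60  # the standard 15 minute delay (some markets use 20)

@dataclass
class Tick:
    symbol: str
    price: float
    ts: float  # exchange timestamp, seconds since epoch

class DelayedFeed:
    """Buffers real time ticks and replays each one exactly
    DELAY_SECONDS after its original timestamp, preserving order."""

    def __init__(self, delay: float = DELAY_SECONDS) -> None:
        self.delay = delay
        self._buffer: deque[Tick] = deque()

    def on_tick(self, tick: Tick) -> None:
        # Ticks arrive in time order from the real time feed.
        self._buffer.append(tick)

    def poll(self, now: float) -> list[Tick]:
        # Release every buffered tick whose delay has elapsed.
        out = []
        while self._buffer and now - self._buffer[0].ts >= self.delay:
            out.append(self._buffer.popleft())
        return out

feed = DelayedFeed()
feed.on_tick(Tick("BAE.L", 7.50, ts=0.0))
feed.on_tick(Tick("BAE.L", 7.52, ts=60.0))
print(feed.poll(now=300.0))  # empty: only 5 minutes have elapsed
print(feed.poll(now=901.0))  # the first tick, now 15 minutes old
```

The contrast with the old model is visible in the code: nothing is discarded, so the delayed stream carries the same analytical content as the real time one, just shifted.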

This not so subtle change to delayed data delivery is creating new markets and new usage cases, for instance:

  • Within financial institutions, as a source for risk management, collateral management, and valuations, because 15 minutes is almost as good as real time for many non-direct trading purposes
  • Many institutions have now started to outsource the processing of the above types of data to external parties, a trend that has its roots in wealth management and is spreading outwards
  • For retail investors, streaming prices are somewhat irrelevant because they cannot directly execute upon them; they need a third party to do the trade. 15 minute delays are fine, and cheap real time non-streaming snapshots keep costs to a minimum when employed
  • The increase in delayed data, and especially charting, on websites catering to the mass market: many supporting their retail clients, others correlating to products like CFDs, some sophisticated like TradingView, plus multiple news, media, commentary, and research outlets


Historical data has not been an afterthought per se; it has usually been given away free as part of the real time feed, and, to be fair, it is only more recently that both consumers and data owners have recognised that it is not only an important part of the investment process, it also facilitates markets in a variety of other ways.

We do, though, need to distinguish the levels of historical data, which are a reflection of, rather than a mirror image of, real time data segmentation.

  • Level 1: End of Day Time Series, the basic entry level data used for price histories, basic charting, and analysis
  • Level 2: Tick by Tick Time Series, more valuable data suitable for analysing and building trading analytics and models
  • Level 3: Tick by Tick Time Series with Bid/Ask history. The gold standard, because the more data a model can use, the better its decision outputs are likely to be. BMLL's services are well worth looking at
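The three levels can be pictured as progressively richer record shapes, from one bar per day up to every trade with its bid/ask context. Field names here are illustrative, not any vendor's schema.

```python
from dataclasses import dataclass

@dataclass
class EndOfDayBar:
    """Level 1: one record per instrument per trading day."""
    date: str
    open: float
    high: float
    low: float
    close: float
    volume: int

@dataclass
class TradeTick:
    """Level 2: one record per executed trade."""
    ts: float    # trade timestamp
    price: float
    size: int

@dataclass
class QuoteTick(TradeTick):
    """Level 3: the trade plus the bid/ask context around it."""
    bid: float
    ask: float

    @property
    def spread(self) -> float:
        # The quoted spread at the moment of the trade, a quantity
        # Level 1 and Level 2 data cannot recover.
        return self.ask - self.bid

q = QuoteTick(ts=0.0, price=7.51, size=100, bid=7.50, ask=7.52)
print(round(q.spread, 2))  # 0.02
```

Each level strictly contains the one below it, which is why Level 3 commands the premium: a Level 1 series can always be rebuilt from Level 3 ticks, but never the reverse.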

What historical data provides are insights into the journey an instrument, like BAE Systems, has taken to reach the price at which it currently trades. It is the sum of activity, i.e. events, and this allows the smart investor to analyse the trends and impacts encountered along the route by highlighting turning points which could affect an individual instrument, specific market segments (for BAE, defence), the market as a whole (the UK), and/or global markets.
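As a toy illustration of that kind of journey analysis, turning points in an end of day close series can be flagged wherever the series switches direction. This is a minimal sketch; real trading and surveillance models are far more elaborate, and the prices below are invented for the example.

```python
def turning_points(closes: list[float]) -> list[tuple[int, str]]:
    """Flag indices where a close series changes direction:
    a local high ('peak') or a local low ('trough')."""
    points = []
    for i in range(1, len(closes) - 1):
        prev, cur, nxt = closes[i - 1], closes[i], closes[i + 1]
        if cur > prev and cur > nxt:
            points.append((i, "peak"))
        elif cur < prev and cur < nxt:
            points.append((i, "trough"))
    return points

# A hypothetical week-and-a-half of closes for one instrument.
closes = [7.10, 7.25, 7.40, 7.30, 7.15, 7.35, 7.60]
print(turning_points(closes))  # [(2, 'peak'), (4, 'trough')]
```

Even this crude detector shows the idea: the turning points, not the individual prices, are what carry the story of the instrument's route to its current level.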

Each journey's individual route map provides two stand-out usage cases:

1. For traders, building advanced models which provide a better understanding of market dynamics and identify trading opportunities, and,

2. For internal and external market surveillance and compliance. Regulators adore Level 3 historical data because it gives them the ability (now proven) to identify collusion from even the most abstract data relationships.


  • Data owners are missing the transition by which historic and delayed data have become revenue generators in their own right. The reasons are readily identifiable: new data providers are successfully monetising delayed data for their own purposes, simultaneously creating new markets
  • Existing pricing models and contract clauses covering historic and delayed data remain based upon legacy assumptions about its use and value, while ignoring events in the real world. Basically, it is forgotten
  • Because delayed and historic data are not considered front rank services, the technical and administrative infrastructure is not in place to properly identify usage at either the exchange or vendor level
  • Having given away delayed and historical data for so long, it is hard, though not impossible, for the innovative data owner to create commercial models, because so much of the historical data is already floating around in the datasphere


Historical and delayed data are both undervalued commodities because of the preconceptions of data sources and, to a lesser extent, their distributors. This stems from a lack of research into, or understanding of, what is happening with, and to, their data in the real world, and the many imaginative ways businesses are finding to make money using that data for their own revenue gain.

Financial markets may be at the apex of the market data food chain in terms of money spent, sophistication, and intensity of usage, but there is an incredible volume of activity lying below. This delayed and historical activity may not superficially appear to be as valuable or as interesting, or fit into traditional business and pricing models; however, what it lacks in margin, it more than makes up for in quantity.

Also ignored is that many businesses develop over time, progressively upgrading from historical and delayed to real time data, to the point where even the retail investor in the street starts subscribing to premium products. By ignoring the value of delayed and historical data, the data owners and their vendors lose line of sight of these prospective new clients, foregoing both short and medium term fee generating opportunities by failing to engage in constructive relationships.

This article's analysis is based upon working with entities across the value chain, from data sources to consumers, and seeing the myriad ways that existing and new data consumers now employ to build and expand their services and offerings in closed and multi-media environments.

Conclusion. Data is either valuable or not valuable depending upon who is using it. The purpose is invariably to make money, either directly or indirectly; data consumed for no reason is a waste. Historical and delayed data consumers are prime sources for positive business engagement, should data owners, sources, and vendors wish to avail themselves.

Keiren Harris 13 July 2022

Please contact for a pdf copy of the article

For information on our consulting services please email