‘The Cloud’ looks genuinely transformational in application, if not necessarily in concept. While the term is over-used and inevitably misused, the Cloud unlocks doors for new and smaller players while reducing reliance upon legacy internal systems, albeit with the danger of becoming dependent upon Cloud providers’ infrastructure.

As the Cloud moves from Storage & Distribution to Accessibility & Productivity, it impacts four major segments of the data workflow:

1. Data Sources. Exchanges, Brokers, Index Creators, ESG and Alt Data providers, which previously relied upon the aggregators for global reach, can now go direct and reach wider audiences via the Cloud. Certain exchanges able to offer trading platforms, analytics, and other data processing capabilities can do the same

2. Data Aggregators. The vendors rely upon their own infrastructure to deliver data and services. This has locked clients into long-term contracts and legacy systems, yet it is balanced by vendors’ ability to supply and manage a substantial range of normalised data for financial institutions, most of whom cannot, and will not in future be able to, replicate this cost effectively

3. Data Processors. This is an unsung growth industry. Where financial institutions once processed data inhouse (and many will continue to do so), independent third-party processors can now offer advanced tech and lower aggregated data acquisition costs, facilitating everything from trading to smart order routing, risk management, and post-trade solutions. The Cloud changes existing productivity models, and this is where I believe it will have the greatest future impact

4. Data Consumers. For this segment it is not about what they want from the data; that remains the same (to make money). It is all about where the data workflows now take place: where and how they access the data, and where they choose to process it to generate the outputs required to participate successfully in global capital markets. The Cloud creates new strategic data choices

The Cloud is now foremost a Productivity play. Accessibility via direct market-to-market connectivity, combined with data processing, impacts choice and cost by introducing new-found flexibility in access to, and operation of, market data and the provision of information.


New technology and infrastructure are changing the inter-relational dynamics between data sources, vendors, processors, and consumers.

Where once data flowed one way, from source to vendor to consumer (usually also the processor), the lines are increasingly blurred: one-way flows become two-way flows, then evolve into multi-directional flows as the inter-connectivity grows in complexity. For instance, the biggest exchanges now offer processing services for any market data, not only their own, from any exchange or other venue, across multiple asset classes. Other analytics and technology facilitators can plug into the Cloud and the data within it, increasing choice because those services source data from a wide variety of other suppliers.

This environment promotes the existence of non-linear workflows in a positive way.

Big Data will prove to be an increasingly important component of this ‘Cloud Web’, as the analytical tools and models which make Big Data valuable can be utilised more efficiently and more effectively within a Cloud environment.

The graphic below looks at these changes from a high level. Regrettably, in the real world there are a number of different Cloud environments, each with overlapping datasets but also increasingly likely to contain unique information not available elsewhere. This forces the data consumer to engage with each Cloud depending upon data availability.

This situation promotes the existence of non-linear workflows, though in a negative way.


The immediate business impact is a dramatic levelling of the playing field between financial institutions, market data vendors, and solutions providers. How?

  • Previously, smaller financial institutions lacked the resources to fund access to expensive market data services, analytical tools, trading solutions, and associated infrastructure. Going forward, these smaller players can select from a ‘restaurant or supermarket menu’ of data and solutions
  • On the data side, accessibility is facilitated by going direct to the sources, and opens the door to vendors adopting new partnership and subscription models, for instance Crux Informatics and NASDAQ’s Quandl
  • Vendor distribution platforms, like TREP and Bloomberg, are naturally tailored to maximise the productivity of their own data and information services. Independent platforms from developers like BCC Group and Pegasus ES aggregate multiple vendors and services within a neutral ‘plug and play’ environment, utilising lower-cost lightweight tech designed to be continuously upgraded, unlike legacy platforms
  • For any financial institution, trying to do all data processing inhouse is expensive and resource intensive. What many financial institutions have not quite realised is that they have already outsourced major trading, analytics, risk, and post-trade functionality to vendors, especially to Bloomberg, which by accident has become its own Cloud. The growth in Cloud-based facilitators offering these services (notably exchanges such as DBAG, LSEG, and NASDAQ, and tech solutions providers like Broadridge and SS&C) allows the data consumer to focus on what they need for the entire trading cycle, i.e. the data outputs. This has also generated a plethora of new facilitators processing data, with algo-driven smart order routing a keenly competitive area. These new facilitators provide services which were previously developed internally by individuals or specific front desks to meet specific needs, because they could not be met economically by an external provider

Greater productivity choices create relational changes: on one side, they put more dollar power in the hands of sources and providers of specialist and proprietary data, while simultaneously offering greater leverage to consumers where there is competition.


In 2017 I wrote that the market data industry will be impacted ‘From above by the new entrants such as Google, Amazon, and Microsoft. They crave content for their financial services business, and will go around vendors direct to sources. They also bring far more resources and reach than even the biggest market data vendors’. Recent deals between Big Tech and exchanges, and the likes of S&P, have vindicated that prediction.

  • Access to more information will allow the Market Data Pie to grow, driven by lower barriers to entry and the ability of more data sources to commercialise their data
  • As vendors like Bloomberg and Refinitiv find it harder to make money by adding incremental data sources (the major ones, i.e. exchanges, brokers, index creators, and news, are already onboard), they are resorting to charging suppliers for carrying their data. This is uneconomic for new data providers, and with data malls like IEX loading terms in their favour, the only option is to go direct and do deals for delivery via the internet or the Cloud
  • Technological disintermediation is resulting in changes to market data business relationships, with more direct source to consumer connectivity, i.e. ‘Hub-Busting’
  • However, partnerships between Big Tech and the Big Exchanges are designed to drive data accessibility and the processing of trades, analytics, risk, and valuations across the trading workflow into their own proprietary Clouds, enhancing their status as Hubs. The caveat is that access to certain proprietary datasets is likely to end up in one Cloud only, forcing the data consumer to access multiple Clouds
  • New technology is lowering barriers to delivery, in both cost and infrastructure terms
  • However, market data Cloud nirvana is probably not achievable, because new technologies must co-exist with legacy infrastructure for the foreseeable future, especially in large institutions, due to (1) the considerable investment already made in existing tech, and (2) the time and resources required to migrate into Clouds

Undoubtedly, each Cloud is an enabler. Whether it is a Data Hub or a Hub Buster, or both, depends upon what the participating parties (data sources, data aggregators, data processors, and data consumers) want to plug into it, and then what they intend to retrieve from it. The Cloud should be about choice, empowered by non-linear workflows.

Talk to us about our consulting and our independent advisory services

Keiren Harris 18 July 2023

For information on our consulting services please email

Please contact for a pdf copy of the article

#exchanges #Banks #TheCloud #Bigdata #AWS #Microsoft #marketdata #data