While ‘honesty statements’ for reporting still exist, Bloomberg’s B-Pipe being a prime example, manual, trust-based reporting is inadequate to cope with the sheer avalanche of market data usage. Inefficiencies lead to inaccuracies, which in turn result in higher costs.

For many years, the systems and processes that encompass data workflows, from exchanges and other sources, through their distributors, to subscribers, have functioned more or less adequately in a linear world. In the age of the Cloud, reporting needs to embrace new, highly active data processors and facilitators, which move data productivity outside the data consumer’s own internal environment.

Leading independent solutions and data providers, for instance BCC Group and Pegasus ES, have incorporated this into their services, as have new types of data nexus that bring together multiple data sources in a single environment, such as Crux Informatics.

Unfortunately, many of the new types of data processors are simply unaware of how data licences work, or choose to ignore them.

The Cloud is a good example of the evolution from single-tasking to multi-tasking in information and data. That evolution led to a twin-track approach to data, and by extension to cost, which is now being re-engineered to cope with less structured flows:

1. Data Management, and

2. Workflow & Process Management

Both are inextricably connected: poor management in one is always reflected in, or results in, poor management in the other. Yet one should always take priority over the other. Good process management will deliver good systems management, while good systems management depends upon good workflow management.

Which brings us back to the mantra of ‘Control, Monitor, Report’.


The data workflow is rarely planned; it is all about getting the data from ingestion to end-user consumption as easily as possible, subject to the demands of the applications that require it, such as trading engines and risk management systems.

It should be efficiency-driven. The imperative is to work from the workflow, assess the resources required, and build data governance structures around real-world practicalities.

The diagram below provides a high-level perspective.


Understanding the basics of licences is a powerful aid to good data governance and a simpler life for administrators. However, the requirements for data usage reporting and for licences are rarely the same. Herein lies the trap: data management is forced to adapt as data usage changes across more than one Cloud environment, because the data consumer must come to grips with the fact that not all the required datasets will be available in one place.

While data usage at the desktop level, i.e. terminals and personal electronic devices, has reached a plateau, the dynamic of how end users access the market data they require is evolving. For instance, terminals, once closed environments, now have greater connectivity, but still tend to be viewed in individual isolation. This makes them relatively easy to control, monitor, and report on.

Moving up in complexity comes network and enterprise usage within an interlinked and inter-dependent environment, where market data is shared, re-distributed, and used in multiple applications. The Cloud brings advantages because it increases accessibility to a wider range of data, but the downside is the need to manage more data sources, each with its own unique licences, definitions, and cost structures. This becomes a resource issue, both in personnel and in infrastructure tools.

As market data processing for mission-critical workflows, like trading engines, risk management, post-trade, and reporting, moves into the Cloud, the question of who does what changes the dynamic of data licensing. Many data owners, like exchanges and vendors, have yet to adjust their business models to reflect new alternative workflows. This opens up the potential for increased out-of-scope data usage: consumers need access to the data, yet have nothing to reference their usage against because the data sources have not properly updated those business models.

In a Cloud world, to ensure cost control, the data consumer must emphasise combining the right tools for ‘Control, Monitor, Report’ with best-practice management processes. Fortunately, this has been recognised by the likes of the afore-mentioned BCC Group, Pegasus ES, and Crux Informatics, who work with clients to provide independent data entitlement solutions. Evaluating new products and services in the Cloud renews the stress on management tools as a core requirement.


The Cloud is also attracting new players who do not understand, nor much care for, the existing market data licence structure that requires the current levels of control, monitoring, and reporting. They see these as irrelevant business barriers, ignoring that the existing system, for all its many faults, does function and provides the financial world with the information investors base their decisions upon.

These new players much prefer a more dystopian ‘uberised’ world, and will eventually make impacts that force change, but at what point and to what extent?

It gets harder to control, monitor, and report as inter-connectivity increases in an expanding user universe. Therefore, the step from managing market data in a relatively closed enterprise universe to a multi-linear one is truly challenging, and a potentially daunting task. These may be new challenges; however, applying known fundamental principles, suitably tweaked to reflect real changes in workflow behaviour, offers the most logical way forward.

There is no turning back the Cloud tide of data; perhaps a returning King Canute can entertain us with an appropriate metaphor.

We advise on market data strategies, sourcing, and costs. Contact us today to arrange for innovative unbiased advice.

Keiren Harris 03 August 2023


Please contact info@marketdata.guru for a PDF copy of the article.

For information on our consulting services please email knharris@marketdata.guru

#marketdata  #marketdatacosts #datalicences #datamanagement #dataownership #thecloud #TRGScreen #MDSL