Financial traders are in a technological arms race. In data-rich financial markets and trading businesses, such as banks and hedge funds, technology enables traders and market analysts to stay ahead of ever-changing markets – never more so than now.
Data analytics is crucial to achieving those ambitions. Organisations need to integrate, manage and analyse large, complex data sets rapidly, sometimes in near real-time. Having worked with many trading businesses, we regularly see the obstacles that arise – teams or data sets sitting in separate silos within the organisation, or the challenges companies face with the quality, volume and timeliness of data reporting and analysis.
At Uniper, an international energy generation and trading company, we helped the market analytics team prepare for anticipated high growth in data volumes and data sets, and the ever-increasing demand for more rapid analysis to inform decision-making and control risk.
Like many trading businesses, Uniper was handling dizzying volumes of data – more than 15 billion data points, over 15 TB of data, 270,000 time-series objects and more than 250,000 data sources. By deploying the Market Data Analytics Platform (MDAP), our Azure-based Platform as a Service (PaaS), which uses a Snowflake database to store market and derived data, we helped the team achieve a real step change in the speed and quality of their data analysis and reporting.
The system was architected to enable Uniper to better correlate and model its near-real-time and highly diverse data sets using a wider range of time-series methodologies, such as event-based, multivariate and matrix time series. It also made possible granular control of data usage, encompassing authentication, authorisation and auditing.
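To make the multivariate idea concrete, here is a minimal sketch using synthetic data: estimating how strongly two market series move together over a rolling window. The series names ("gas", "power"), the coefficients and the window length are all illustrative assumptions, not details of Uniper's platform.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
idx = pd.date_range("2024-01-01", periods=200, freq="D")

# A synthetic random-walk "gas" price, and a "power" price that tracks it with noise.
gas = pd.Series(np.cumsum(rng.normal(0, 1, 200)), index=idx, name="gas")
power = 0.8 * gas + pd.Series(rng.normal(0, 0.5, 200), index=idx)

frame = pd.DataFrame({"gas": gas, "power": power})

# Rolling 30-day correlation between the two series.
rolling_corr = frame["gas"].rolling(30).corr(frame["power"])
print(rolling_corr.dropna().tail())
```

In a real deployment the inputs would be pulled from the platform's market-data store rather than generated in memory, but the analytical step – pairing series on a shared time index and computing windowed statistics – is the same.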
The greater flexibility and ease of customisation also drove high levels of user adoption amongst traders and analysts – often the single most important determinant of project success in a trading business. Users could set up configurable reporting options, which allow reporting outputs to be scaled up or down according to business needs, and leverage powerful analysis tools such as Tableau, Azure ML and Databricks, plus APIs for future extensions to the platform.
We discovered that up to 90% of a data scientist’s time can be spent on data “wrangling” – data selection, ingestion, validation, visualisation and schema creation. We helped streamline this process so that the subsequent analytics and modelling phases could proceed faster. The distinction matters because organisations spend huge sums on data scientists, yet get much less value and output than they expect because they have failed to account for data wrangling.
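A hypothetical sketch of what those wrangling steps look like in practice – ingestion, validation and a schema check – is below. The column names, the sentinel value and the validation rules are illustrative assumptions, not Uniper's actual schema.

```python
import io
import pandas as pd

# Raw feed with a missing price and a -9999 sentinel value (both illustrative).
RAW_CSV = """timestamp,source,price
2024-01-01T00:00:00,EPEX,41.2
2024-01-01T01:00:00,EPEX,
2024-01-01T02:00:00,EPEX,-9999
"""

# Expected schema: column name -> pandas dtype.
SCHEMA = {"timestamp": "datetime64[ns]", "source": "string", "price": "float64"}

def ingest_and_validate(csv_text: str) -> pd.DataFrame:
    # Ingestion: parse raw text into a typed frame.
    df = pd.read_csv(io.StringIO(csv_text), parse_dates=["timestamp"])
    df["source"] = df["source"].astype("string")
    # Validation: drop missing prices and out-of-range sentinel values.
    df = df.dropna(subset=["price"])
    df = df[df["price"] > -1000]
    # Schema check: every expected column present with the right dtype.
    for col, dtype in SCHEMA.items():
        assert str(df[col].dtype) == dtype, f"{col}: {df[col].dtype} != {dtype}"
    return df.reset_index(drop=True)

clean = ingest_and_validate(RAW_CSV)
print(len(clean))  # rows surviving validation
```

Automating checks like these is what frees data scientists from wrangling and lets the modelling work start sooner.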
Another crucial lesson was how powerful Agile approaches can be in high-demand, high-risk, high-profile – and data-rich – environments. By operating in small, self-sufficient teams, you can work iteratively to continuously assess and adapt activity, produce multiple prototypes for your partners, and drive the project forward pragmatically to achieve a series of both small and large goals.
Deploying Agile approaches, taking into account the importance of data wrangling, and building data analytics solutions that users actually need and want to use are all central to best practice that can be adopted by any project. These lessons apply not only within financial services – which tends to be relatively advanced in data analytics, especially in the high-tech world of trading – but in any organisation and sector, no matter how far behind the adoption curve.
Ian Murrin is the Founder and CEO of Digiterre