Cloud 2.0 and the new role of market data – Colt at DKF 2019

The Capital Markets industry is experiencing a fundamental transformation. As more applications and services move to the cloud, businesses are consuming more bandwidth, managing new services and having to analyse and store rapidly growing volumes of data. Meanwhile, investor spending on financial data has jumped to post-crisis records, and big fund managers need to analyse potential trading signals while remaining compliant. This is fundamentally changing how they operate and do business.

To meet these needs, new technologies have been developed to help automate tasks and to interrogate the new quantities of data that would be impossible for humans to process. The tools and services to do this are hosted in the cloud, and for many businesses it’s no longer a question of if they’ll move to the cloud, but when.

The cloud ecosystem is the basis for advanced analytics, with hundreds of applications and coding APIs for decision-support analytics and alpha generation. Sophisticated hedge funds use platforms for sourcing new data sets, seeking unique, predictive insights from market data and alternative data sets that have already been cleansed and transformed into a usable format. The challenge today for those in the Capital Markets space is how to select and manage the right tools to set themselves apart from the competition.

Colt at DKF 2019

This is the question we’ll be trying to answer at the 2019 DACH+ Kongress für Finanzinformationen (DKF), held on 9 May in Munich. Colt is hosting a panel discussion ‘Market data and Cloud 2.0’ where we’ll be joined by Stephan Schmidt-Tank of Amazon Web Services, Luke Ryan from Morningstar and Dale Richards from RoZetta.

The advantage of cloud services is that they can be quickly ordered and provisioned, giving businesses a much greater degree of flexibility. AI can help spot problems, analyse complex unstructured data or automate time-consuming tasks, while blockchain can be used to manage transactions by time-stamping critical data and recording its provenance, providing a level of trust that a conventional database can't match. Open-source distributed processing frameworks such as Hadoop can be used to manage big data and support advanced analytics initiatives, including predictive analytics, data mining and machine learning applications.
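The provenance idea mentioned above can be illustrated with a minimal sketch: hash-chained, timestamped records, where each entry's hash covers the previous one, so any later tampering is detectable. This is an illustrative toy in Python, not Colt's or any vendor's implementation; the function names and record layout here are assumptions for the example.

```python
import hashlib
import json
import time

def append_record(chain, payload):
    """Append a timestamped record whose hash also covers the previous
    record's hash, chaining the entries together."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {
        "timestamp": time.time(),
        "payload": payload,
        "prev_hash": prev_hash,
    }
    # Hash a canonical (sorted-keys) serialisation of the record body.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return record

def verify(chain):
    """Recompute every hash link; returns True only if no record
    (payload, timestamp or ordering) has been altered."""
    prev_hash = "0" * 64
    for record in chain:
        if record["prev_hash"] != prev_hash:
            return False
        body = {k: v for k, v in record.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True
```

Editing any earlier payload breaks verification, which is the extra level of trust a plain mutable database table does not give you by default.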

Last year we held a well-attended discussion on how to bring data to the cloud. For 2019 we will cover what to do next, including the tools and services required to generate new insights into niche trading strategies and customer preferences using AI and machine learning.

The roundtable will take place from 11:15 to 12:15 in Boardroom B at DKF 2019. To secure your ticket for the event or to book a meeting with Colt, get in touch today.

Terence Chabe is Business Development Manager – Capital Markets, at Colt.
