Cloud 2.0 and the new role of market data – Colt at DKF 2019


The Capital Markets industry is experiencing a fundamental transformation. As more applications and services move to the cloud, businesses are consuming more bandwidth, managing new services and having to analyse and store a rapidly growing volume of data. Meanwhile, investor spending on financial data has jumped to post-crisis records, and big fund managers need to analyse potential trading signals while remaining compliant. All of this is changing how they operate and do business.

To meet these needs, new technologies have been developed to automate tasks and to interrogate quantities of data that would be impossible for humans to process. The tools and services to do this are hosted in the cloud, and for many businesses it’s no longer a question of if they’ll move to the cloud, but when.

The cloud ecosystem is the basis for advanced analytics, with hundreds of applications and APIs for decision-support analytics and alpha generation. Platforms for sourcing new data sets are used by sophisticated hedge funds seeking unique, predictive insights from market data and alternative data sets that have already been cleansed and transformed into a usable format. The challenge today for those in the Capital Markets space is how to select and manage the right tools to set themselves apart from the competition.
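As a simple illustration of the cleansing and transformation step mentioned above, here is a minimal Python sketch that normalises a raw, hypothetical tick-data feed into a usable format. The column names and sample values are invented for the example; real platforms of course work at far greater scale and sophistication.

```python
import pandas as pd

# Hypothetical raw tick data: mixed-case symbols, string timestamps,
# a missing timestamp and a zero price - typical of an uncleansed feed.
raw = pd.DataFrame({
    "symbol": ["aapl", "AAPL", "msft", "MSFT", "msft"],
    "ts": ["2019-05-09 09:30:00", "2019-05-09 09:30:01",
           "2019-05-09 09:30:00", None, "2019-05-09 09:30:02"],
    "price": [210.5, 210.6, 0.0, 125.3, 125.4],
})

cleaned = (
    raw.assign(symbol=raw["symbol"].str.upper(),              # normalise tickers
               ts=pd.to_datetime(raw["ts"], errors="coerce"))  # parse timestamps
    .dropna(subset=["ts"])                                     # drop rows with no timestamp
    .query("price > 0")                                        # discard obviously bad prices
    .sort_values(["symbol", "ts"])
    .reset_index(drop=True)
)

print(cleaned)
```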

Colt at DKF 2019

This is the question we’ll be trying to answer at the 2019 DACH+ Kongress für Finanzinformationen (DKF), held on 9 May in Munich. Colt is hosting a panel discussion ‘Market data and Cloud 2.0’ where we’ll be joined by Stephan Schmidt-Tank of Amazon Web Services, Luke Ryan from Morningstar and Dale Richards from RoZetta.

The advantage of cloud services is that they can be ordered and provisioned quickly, giving businesses a much greater degree of flexibility. AI can help spot problems, analyse complex unstructured data or automate time-consuming tasks, while blockchain can be used to manage transactions by time-stamping critical data and recording its provenance, providing a level of trust that a conventional database can’t match. Open source distributed processing frameworks such as Hadoop can be used to manage big data in support of advanced analytics initiatives, including predictive analytics, data mining and machine learning.
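To make the blockchain point concrete, below is a minimal sketch of hash-chained time-stamping in Python. It is not a distributed ledger, only an illustration of why chaining records by hash makes tampering with historical data detectable; the trade records are invented for the example.

```python
import hashlib
import json
import time

def add_record(chain, payload):
    """Append a time-stamped record whose hash covers the previous record's
    hash, so any later change to history breaks every subsequent link."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)

def verify(chain):
    """Recompute every hash; returns False if any record was altered."""
    for i, record in enumerate(chain):
        body = {k: v for k, v in record.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["hash"] != expected:
            return False
        if i > 0 and record["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_record(chain, {"trade": "BUY 100 XYZ @ 42.10"})
add_record(chain, {"trade": "SELL 50 XYZ @ 42.35"})
print(verify(chain))                           # True
chain[0]["payload"]["trade"] = "BUY 999 XYZ @ 1.00"
print(verify(chain))                           # False: provenance check fails
```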

Last year we held a well-attended discussion on how to bring data to the cloud. For 2019 we will cover what to do next, including the tools and services required to generate new insights for niche trading strategies, understand customer preferences and apply AI and machine learning.

The roundtable will take place from 11:15 – 12:15 in Boardroom B at DKF 2019. To secure your ticket for the event or to book a meeting with Colt, get in touch today.

Terence Chabe is Business Development Manager – Capital Markets, at Colt
