Cloud 2.0 and the new role of market data – Colt at DKF 2019


The Capital Markets industry is experiencing a fundamental transformation. As more applications and services move to the cloud, businesses are consuming more bandwidth, managing new services and having to analyse and store a rapidly increasing amount of data. Meanwhile, investor spending on financial data has jumped to post-crisis records, and big fund managers need to analyse potential trading signals while remaining compliant. This is fundamentally changing how they operate and do business.

To meet these needs, new technologies have been developed to help automate tasks and to interrogate the new quantities of data that would be impossible for humans to process. The tools and services to do this are hosted in the cloud, and for many businesses it’s no longer a question of if they’ll move to the cloud, but when.

The cloud ecosystem is the basis for advanced analytics, with hundreds of applications and coding APIs for decision-support analytics and alpha generation. Platforms for sourcing new data sets are used by sophisticated hedge funds seeking unique, predictive insights from market data and alternative data sets already cleansed and transformed into a usable format. The challenge today for those in the Capital Markets space is how to select and manage the right tools to set themselves apart from the competition.

Colt at DKF 2019

This is the question we’ll be trying to answer at the 2019 DACH+ Kongress für Finanzinformationen (DKF), held on 9 May in Munich. Colt is hosting a panel discussion ‘Market data and Cloud 2.0’ where we’ll be joined by Stephan Schmidt-Tank of Amazon Web Services, Luke Ryan from Morningstar and Dale Richards from RoZetta.

The advantage of cloud services is that they can be quickly ordered and provisioned, giving businesses a much greater degree of flexibility. AI can help spot problems, analyse complex unstructured data or automate time-consuming tasks, while blockchain can be used to manage transactions by time-stamping critical data and establishing data provenance, providing an additional level of trust that a conventional database can't match. Open source distributed processing frameworks such as Hadoop can be used to manage big data to support advanced analytics initiatives including predictive analytics, data mining and machine learning applications.
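To make the time-stamping and provenance idea concrete, here is a minimal, illustrative sketch in Python of a hash-chained ledger, the core mechanism behind blockchain-style tamper evidence. This is not a Colt or production implementation, and the record fields are hypothetical; it simply shows how chaining each entry's hash to the previous one makes any later alteration detectable:

```python
import hashlib
import json
import time

def make_entry(data, prev_hash):
    """Create a time-stamped ledger entry chained to the previous entry's hash."""
    entry = {
        "timestamp": time.time(),
        "data": data,
        "prev_hash": prev_hash,
    }
    # Hash the entry body (deterministic serialisation via sorted keys)
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

def verify_chain(chain):
    """Recompute every hash and check the links; tampering breaks verification."""
    for i, entry in enumerate(chain):
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected:
            return False
        if i > 0 and entry["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny ledger of market-data events (field names are illustrative only)
chain = []
prev = "0" * 64
for tick in [{"symbol": "XYZ", "price": 101.5},
             {"symbol": "XYZ", "price": 101.7}]:
    entry = make_entry(tick, prev)
    chain.append(entry)
    prev = entry["hash"]

print(verify_chain(chain))          # True: chain is intact
chain[0]["data"]["price"] = 999.0   # retroactively alter a record
print(verify_chain(chain))          # False: tampering is detected
```

A conventional database would accept the edited price silently; here the stored hash no longer matches the recomputed one, which is the "additional level of trust" referred to above.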

Last year we held a well-attended discussion on how to bring data to the cloud. For 2019 we will cover what to do next, including the tools and services required to generate new insights for niche trading strategies, customer preferences and AI or machine learning.

The roundtable will take place from 11:15 – 12:15 in Boardroom B at DKF 2019. To secure your ticket for the event or to book a meeting with Colt, get in touch today.

Terence Chabe is Business Development Manager – Capital Markets, at Colt


Colt Technology Services

29 April 2019

