Cloud 2.0 and the new role of market data – Colt at DKF 2019
Published by Stuart Brameld on May 31, 2019
The Capital Markets industry is experiencing a fundamental transformation. As more applications and services move to the cloud, businesses are consuming more bandwidth, managing new services and having to analyse and store rapidly increasing volumes of data. Meanwhile, investor spending on financial data has jumped to post-crisis records, and big fund managers need to analyse potential trading signals while remaining compliant. This is fundamentally changing how they operate and do business.
To meet these needs, new technologies have been developed to help automate tasks and to interrogate the new quantities of data that would be impossible for humans to process. The tools and services to do this are hosted in the cloud, and for many businesses it’s no longer a question of if they’ll move to the cloud, but when.
The cloud ecosystem is the basis for advanced analytics, with hundreds of applications and coding APIs for decision-support analytics and alpha generation. Sophisticated hedge funds use platforms for sourcing new data sets, seeking unique, predictive insights from market data and alternative data sets already cleansed and transformed into a usable format. The challenge today for those in the Capital Markets space is selecting and managing the right tools to stand apart from the competition.
Colt at DKF 2019
This is the question we’ll be trying to answer at the 2019 DACH+ Kongress für Finanzinformationen (DKF), held on 9 May in Munich. Colt is hosting a panel discussion ‘Market data and Cloud 2.0’ where we’ll be joined by Stephan Schmidt-Tank of Amazon Web Services, Luke Ryan from Morningstar and Dale Richards from RoZetta.
The advantage of cloud services is that they can be ordered and provisioned quickly, giving businesses a much greater degree of flexibility. AI can help spot problems, analyse complex unstructured data or automate time-consuming tasks, while blockchain can be used to manage transactions by time-stamping critical data and recording its provenance, providing a level of trust that a conventional database can't match. Open-source distributed processing frameworks such as Hadoop can be used to manage big data in support of advanced analytics initiatives, including predictive analytics, data mining and machine learning applications.
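To make the blockchain point concrete: the trust property comes from chaining each time-stamped record to the hash of the previous one, so any later tampering breaks every subsequent link. The sketch below is purely illustrative (a hash chain in plain Python, not any particular ledger product or Colt service) and the record fields are invented for the example.

```python
import hashlib
import json
from datetime import datetime, timezone

GENESIS = "0" * 64  # placeholder hash for the first entry's back-link

def make_entry(prev_hash, payload):
    """Create a time-stamped record chained to the previous entry's hash."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "payload": payload,
        "prev_hash": prev_hash,
    }
    # Hash is computed over the canonical JSON of the other fields.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

def verify_chain(chain):
    """Recompute every hash and check the back-links; tampering breaks both."""
    prev = GENESIS
    for entry in chain:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

# Build a tiny chain of (hypothetical) trade records and verify it.
chain, prev = [], GENESIS
for trade in ["BUY 100 XYZ", "SELL 50 ABC"]:
    entry = make_entry(prev, trade)
    chain.append(entry)
    prev = entry["hash"]

assert verify_chain(chain)
chain[0]["payload"] = "BUY 999 XYZ"  # alter a historical record
assert not verify_chain(chain)      # the tampering is now detectable
```

A conventional database row can be rewritten silently; here the altered record no longer matches its stored hash, which is the property the paragraph above is describing.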
Last year we held a well-attended discussion on how to bring data to the cloud. For 2019 we will cover what to do next, including the tools and services required to generate new insights for niche trading strategies, customer preferences and AI or machine learning.
The roundtable will take place from 11:15 to 12:15 in Boardroom B at DKF 2019. To secure your ticket for the event or to book a meeting with Colt, get in touch today.
Terence Chabe is Business Development Manager – Capital Markets, at Colt