The term ‘big data’ emerged as something of a buzzword a few years ago, with industry experts sharing their visions for how the data explosion would transform the way businesses engage with their customers, develop products and drive revenue. The initial hype has since subsided, and organisations across industries – financial services, retail and manufacturing, for example – now gather both structured and unstructured data from a multitude of sources and analyse it in near real-time. With millions of data points at their fingertips, businesses can glean more in-depth insights into their operations than ever before, empowering them to make decisions faster and stay ahead of the game.

As the proliferation of big data technologies continues, we are seeing more and more enterprises looking to move their data-intensive services and applications to the cloud. Our recent Tech Deficit study found that the adoption of Infrastructure-as-a-Service and Software-as-a-Service is set to grow by over 50% over the next two years in Europe.

The combination of big data applications and cloud services is generating a surge in traffic between data centres. It has been estimated that by 2018, cloud services and applications will generate 76% of the world’s 8.6 zettabytes of data centre traffic. In case you were wondering, 8.6 ZB equals about 9 trillion hours of streaming HD content online.

But as more businesses realise the benefits that data analytics can bring, and their reliance on cloud-enabled data applications grows, using the public Internet to access cloud services is no longer viable. Common concerns include lack of control over connectivity, bandwidth and security. So how can cloud service providers ensure that there is no bottleneck between their cloud platforms in different data centres, and lower the barrier for enterprises to migrate their data-intensive services and applications to the cloud?
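As a rough sanity check on that zettabyte figure, the arithmetic works out if you assume HD streaming consumes roughly 2 Mbps (about 0.9 GB per hour) – that bitrate is an assumption for illustration, not a figure from the traffic estimate itself:

```python
ZB = 10**21                      # one zettabyte in bytes (decimal units)
total_traffic_bytes = 8.6 * ZB   # estimated 2018 data centre traffic

# Assumed HD streaming rate: ~2 Mbps (an illustrative assumption)
bytes_per_hour = 2e6 / 8 * 3600  # 2 Mbps -> bytes/sec -> bytes/hour (~0.9 GB)

hours = total_traffic_bytes / bytes_per_hour
print(f"{hours / 1e12:.1f} trillion hours")  # roughly 9-10 trillion hours
```

At that assumed bitrate the total comes out to just under ten trillion hours, consistent with the “about 9 trillion hours” comparison above.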
Recognising the shortcomings of the public Internet, cloud service providers such as Amazon Web Services, Microsoft and VMware are tapping into carrier-grade connectivity for the delivery of network-based private and hybrid private/public cloud solutions. Carrier Ethernet, with multiple 10Gb/s or 100Gb/s of bandwidth, is emerging as the dominant technology for the seamless replication of huge data volumes between distributed data centres in real-time. Services such as Dedicated Cloud Access from Colt allow organisations to make the most of cloud services with the same confidence as they would using private network connections such as Ethernet or IP-VPN – completely eliminating concerns around bandwidth, resilience and security.

As our enterprise customers look to reap the benefits of cloud-powered big data applications, we will continue to work with cloud service providers to boost the security, performance and reliability of their cloud infrastructures with our Dedicated Cloud Access service. With more than 430 connections to European data centres via our high-bandwidth network, we help cloud service providers ensure that their platforms are equipped with resilient, robust connectivity, encouraging more European organisations to take the plunge with public, private and hybrid cloud services.

I will be at the Metro Ethernet Forum GEN14 event this week, which focuses on the merger of Carrier Ethernet and cloud services, taking part in a panel discussion entitled “Network-enabled cloud services: expanding options for the digital enterprise”. If you’re in Washington, come hear how more robust connectivity options and network-based private and hybrid cloud solutions are empowering more organisations across industries to migrate their data-intensive applications to the cloud.