Big Data is getting bigger and bigger
Every day we are bombarded by companies and media telling us that Big Data is the next big thing. But in truth Big Data, or the large quantities of information we collect on a daily basis, is already impacting our lives by changing the way we make decisions about buying goods and services. Ever bought something online and been told what would go well with the item you are viewing, or that other people who purchased this item also bought…? All of this seemingly helpful advice is made possible by the tracking, processing and analysis of Big Data, so named for the complexity of its volume, velocity and variety:
• Volume: In just 60 seconds, a typical bank generates millions of financial transactions, over 200 million emails are sent, over 100,000 tweets are posted and 48 hours of new video are uploaded to YouTube.
• Velocity: Time-sensitive processes such as trading derivatives or catching fraud demand that data is available in real time; otherwise it loses its ability to deliver real advantage.
• Variety: Big Data is any type of data, whether machine-generated (structured), such as sensor data, financial transactions and log files, or human-generated (unstructured), such as audio, photos and social media.
The amount of information we now collect about the market, our customers and our employees allows us to answer questions that were previously considered beyond reach. By combining different types of data and analysing them together, businesses are able to extract more value from data, gaining new customer insights and becoming more agile in their response to customer needs and new market opportunities.
However, storing, retrieving, analysing and processing all of this data in its multiple formats is no minor consideration when it comes to IT resources. When our favourite website suggests a product to us, the suggestion rests not only on huge amounts of data but also on multiple systems that identify us, our device, our preferences and our location, matching that data with historical buying patterns before identifying opportunities for upselling or cross-selling.
Traditionally, all of those systems would be entirely standalone, with information held in silos accessible only to administrators. When analysis was needed, the business would make a request and, a few days later, IT would provide the raw data or a report. This post-mortem approach is clearly at odds with the continuously evolving, adaptive models modern businesses require for speed to market, efficiency and increased profitability. Those systems have to be integrated and underpinned by an infrastructure that can scale up to support the ever-increasing quantities of data that businesses collect on a daily basis. There are two ways you can approach this infrastructure challenge:
• Build the systems and processes internally and manage the ongoing and unpredictable costs
• Work with a service provider who can share the huge investments required to scale up in an agile and elastic way
Whichever route you choose, having lots of data does not equate to having insight or value: it isn’t about the amount of data we’re acquiring but the behavioural trends we can identify, and the revenue and efficiency gains those trends lead to. As Forrester points out in a recent article, this requires collaboration between IT and the business to drive maximum insight: “Successful firms will meet this challenge through establishing a new level of collaboration between business and IT; developing new processes to deliver solutions; and mastering a rapidly evolving technology landscape.” The question is not whether to use Big Data; it is which people, processes and platforms you choose to ensure the business stays ahead of the competition.