Long live small data: five insights from the #smalldataforum
24 May 2016 12:00 am by Leela Bozonelis
On Tuesday 17 May LexisNexis hosted The Small Data Forum. The event focused on how businesses look at data collected from customers to deliver actionable insight that informs business decisions.
Thomas Stoeckle, Global Head of Evaluation & Insights at LexisNexis Business Information Solutions (BIS), hosted the LexisNexis Small Data Forum event. Guest speakers Neville Hobson, Senior Business Consultant at IBM, and Sam Knowles, Founder & Managing Director of Insight Agents, discussed the evolving role of data in business, and the insight that can be gleaned from it.
The primary theme throughout the day was how best to identify, isolate and combine different types of data to find the important nuggets of information that can shape better decision making: turning big data into small data.
Another recurring theme, highlighted both by the speakers and the audience, was the growing relevance of data to a number of different roles within a business. Data analysis is no longer the remit of data scientists alone. Data-driven business decisions are being made throughout organisations, particularly in marketing, communications and social media departments.
While people in these roles have always been exposed to data in some form, they now need to know how to gain business insight from it. Here are five insights from The Small Data Forum on how people should be looking at data, and why some common misconceptions may be holding people back from extracting intelligence from enterprise-scale data sets:
1. The value of data is in the ingredients
Making data valuable is about mixing together the right ingredients in the right way. Similar to baking a cake, data insight consists of a variety of individual ingredients mixed together to present a contextualised result. As Sam Knowles summarised, "You can buy all the best ingredients, but if you don't know how to put them together in the right way and in the right order, and cook them for the right amount of time, you're always going to be closer to garbage than gâteaux."
Contextualised data analytics brings together a number of different types of data in a standard format, but context is only possible if the varying data types are consolidated and normalised. This allows the data to be presented in a meaningful format and matched against new data sets, so that the 'recipe' holds together.
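As a minimal sketch of what consolidation and normalisation might look like in practice, the snippet below maps two invented records, each from a hypothetical source with its own field names and date formats, onto one shared schema so they can be matched. The record fields and source names are illustrative assumptions, not anything discussed at the forum.

```python
from datetime import datetime

# Hypothetical raw records from two different sources, each with its own
# field names and date conventions (both invented for illustration).
crm_record = {"cust_id": "C-102", "signup": "17/05/2016", "region": "UK"}
web_record = {"userId": "C-102", "firstSeen": "2016-05-17T09:30:00Z"}

def normalise_crm(rec):
    # Map the CRM-style fields onto a shared schema with ISO dates.
    return {
        "customer_id": rec["cust_id"],
        "first_contact": datetime.strptime(rec["signup"], "%d/%m/%Y").date().isoformat(),
        "source": "crm",
    }

def normalise_web(rec):
    return {
        "customer_id": rec["userId"],
        "first_contact": rec["firstSeen"][:10],  # keep the date portion only
        "source": "web",
    }

combined = [normalise_crm(crm_record), normalise_web(web_record)]
# Once both records share a schema, they can be matched on customer_id.
matched = [r for r in combined if r["customer_id"] == "C-102"]
print(matched)
```

Only after this step can the two sources be compared like-for-like; without it, the same customer looks like two unrelated records.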
2. Big Data is both oil and soil
Data lends itself to the oil comparison because it follows a very similar process structure: upstream, midstream and downstream. Raw data is collected (upstream), aggregated and enriched (midstream), and then distributed and actioned (downstream).
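The three stages above can be sketched as a tiny pipeline. The stage functions and sample data below are invented for illustration; the point is only the shape of the flow: collect raw input, clean and aggregate it, then deliver it in an actionable form.

```python
# A minimal sketch of the upstream -> midstream -> downstream flow;
# the sample data and stage logic are illustrative assumptions.

def collect():
    # Upstream: raw, messy data arrives from its sources.
    return [" Positive review ", "negative review", " Positive review "]

def enrich(raw):
    # Midstream: clean, deduplicate and aggregate.
    cleaned = {item.strip().lower() for item in raw}
    return sorted(cleaned)

def distribute(enriched):
    # Downstream: deliver the result in an actionable shape.
    return {"records": enriched, "count": len(enriched)}

report = distribute(enrich(collect()))
print(report)  # {'records': ['negative review', 'positive review'], 'count': 2}
```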
More recently, the industry has been described as soil – an analogy that stresses the importance of enrichment to create a fertile ground that deeper meaning and insight can be 'grown' from.
The reality is that data is both. To ensure insight gained at the end of the data enrichment process continually enables people to make confident decisions, there needs to be constant feedback between each stage. Confident decisions enable 'enrichment' opportunities. Generating business value from customer data, for example, is less about the technology used to collect the data, and more about how the data is applied to deliver a more valuable customer experience.
3. A clever nutcracker can be more useful than a sledgehammer
Good insight does not necessarily come from collating and processing more and more data, but from being smarter about the data that is collected. Actionable insight does not always require an expensive processing system. Insight may even come from small amounts of data, where the identification of meaningful connections can be just as useful as insight from big data.
4. A better understanding of the current environment helps shape the future
By identifying trends in historical data sets and then analysing this information against real-time data feeds, a business can use its understanding of data from the past to drive better decision making in the future. This can be implemented to develop new initiatives or drive improvement in efficiency. Analysing these initiatives creates a new data set, which begins the cycle all over again. When this process is executed effectively, a business can search for and identify trends that give it a crucial competitive advantage.
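One simple form of this idea is to learn a baseline from historical data and check each live reading against it. The numbers and the tolerance threshold below are invented for illustration; real implementations would use far richer models.

```python
# Illustrative sketch: compare a real-time reading against a baseline
# learned from historical data (all figures are invented).

historical = [102, 98, 105, 101, 99, 103]  # e.g. past weekly enquiry volumes
baseline = sum(historical) / len(historical)

def flag_anomaly(live_value, baseline, tolerance=0.15):
    """Flag a live reading deviating more than `tolerance` from the baseline."""
    return abs(live_value - baseline) / baseline > tolerance

print(flag_anomaly(140, baseline))  # well above the historical norm -> True
print(flag_anomaly(100, baseline))  # in line with history -> False
```

Each flagged reading, once investigated, becomes part of the historical record, which is the feedback cycle the paragraph above describes.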
5. Algorithms are important, but the human element cannot be replicated (yet)
Complex algorithms are fundamental to the analysis of large data sets, allowing humans to harness the processing power of machines to gain insight. Fundamentally, data is simply streams of ones and zeroes. Without proper human intervention and analysis, it is meaningless. Algorithms can be programmed to decide, in simple terms, whether something is positive or negative, but they cannot interpret deeper human emotions. While it is difficult to identify exactly at which point human intervention is needed in the data analysis process, it is clear that the analysis of humans and machines works as "yin and yang".
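A deliberately naive example makes the limitation concrete. The keyword lists and sample sentences below are invented; the sketch shows the kind of coarse positive/negative call an algorithm can make, and the sarcasm it misses where a human would not.

```python
import re

# Illustrative keyword-based sentiment sketch (word lists are assumptions).
POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"bad", "hate", "terrible"}

def classify(text):
    # Crude scoring: count positive vs negative keyword hits.
    words = set(re.findall(r"[a-z']+", text.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(classify("I love this product"))          # positive
print(classify("terrible support experience"))  # negative
# Sarcasm defeats the keyword approach: a human reads this as negative.
print(classify("oh great, it broke again"))     # positive (wrong)
```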
For more on the topic, read Sam Knowles' blog post 3 Clues that Big Data might be Old News #smalldataforum: https://www.linkedin.com/pulse/3-clues-big-data-might-old-news-smalldataforum-sam-knowles?trk=prof-post