
Highlighting differences: Big Data versus Analytics

“Big Data” and “Analytics” are still two of the major topics in IT. At the same time, they are often misunderstood disciplines, starting with their definitions. Big Data is not just about large amounts of data, as you might expect; it is also about using a variety of data types and, in some cases, processing data at the speed at which it emerges, in order to achieve insights that matter.

The definition of ‘analytics’ is ambiguous today. With BI (Business Intelligence) slowly becoming an acronym of the past, it is fair to say that the term ‘analytics’ is used to encompass BI as well as new technologies that allow us to predict future events and ultimately prescribe responses to those events.

In our field, the combination of Big Data and analytics enables us to predict, each with an associated probability, when a meeting room will be available, how many employees will come to your facility tomorrow and what they will prefer for lunch. In the field of asset management, predicting asset inefficiencies and failures can substantially reduce downtime and avoid unnecessary costs.
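
To make this concrete, below is a minimal sketch of how such a prediction could be set up: a logistic regression estimating the probability that a meeting room is occupied at a given hour, trained on historical booking records. The features and the synthetic data are assumptions made purely for illustration, not a description of any specific product.

    # Estimate the probability that a meeting room is occupied at a given hour,
    # using synthetic historical booking records. Features and data are
    # hypothetical; they only illustrate the idea of 'prediction with probability'.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical history: hour of day (8-17), day of week (0 = Monday),
    # and whether the room turned out to be occupied (1) or free (0).
    hours = rng.integers(8, 18, size=1000)
    days = rng.integers(0, 5, size=1000)
    occupied = ((hours >= 10) & (hours < 16) & (days < 4)).astype(int)
    occupied ^= (rng.random(1000) < 0.15).astype(int)  # flip ~15% of labels as noise

    model = LogisticRegression().fit(np.column_stack([hours, days]), occupied)

    # Probability that the room is free on a Tuesday at 14:00.
    p_occupied = model.predict_proba([[14, 1]])[0, 1]
    print(f"Estimated probability the room is free: {1 - p_occupied:.0%}")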

But first, let’s go back to basics: What is Big Data, and, above all, what isn’t it? What is analytics, and how do you apply it?

Back to Basics

Gartner defines Big Data as: 

“High-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.”

As expected, the term ‘big’ refers to large amounts of data (volume) according to this definition. Today we are storing almost incomprehensible amounts of data. To illustrate this: by 2020, it is expected that 40 zettabytes of data will be created, more than 300 times the amount created in 2006. However, volume is only one of the ‘V’s that make data ‘big’.

The other components Gartner refers to are Variety, the different manifestations or formats in which data is available (from video to static data), and Velocity, the tremendous speed at which data becomes available. A fourth ‘V’ that is often added is Veracity, which refers to the uncertainty of data. With so much data, it becomes harder to identify errors and to separate signal from noise. According to most data scientists, however, this is rarely a problem: you can still draw representative conclusions from the data as a whole.

Analytics

There are many ways to analyse data. Organisations are gathering huge volumes of data today, especially in cases where IoT technologies are installed. The number of potential investigations into that data is overwhelming.

So, the question is: What to analyse? In fact, this question is not only fundamental for setting up analysis approaches, it also determines the type of data to be acquired. When the capacity and occupancy of the available meeting rooms need to be analysed in an effort to predict future demand, one will need a summary of the number of meeting rooms, the reservations and (if available) actual occupancy data over the past 15 years.
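
To illustrate, here is a minimal sketch of how such a summary could be produced from a reservation log. The file name and the columns (room_id, start_time, attendees) are assumptions, not a prescribed format.

    # Aggregate historical reservations per room and per month, as a basis for
    # forecasting future demand. File name and columns are hypothetical.
    import pandas as pd

    reservations = pd.read_csv("reservations.csv", parse_dates=["start_time"])

    monthly_demand = (
        reservations
        .assign(month=reservations["start_time"].dt.to_period("M"))
        .groupby(["room_id", "month"])
        .agg(bookings=("start_time", "count"),
             avg_attendees=("attendees", "mean"))
        .reset_index()
    )

    print(monthly_demand.head())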

A sound approach to applying analytics typically has a number of elements in it:

  • Start with prioritising ‘business problems’
    Any (new) analytics activity involves investments (time, tools). Make sure the investment is put into the issues that will return tangible value.
  • Evaluate against data available
    Check whether sound data sets are available for analysis of the business problems in question (a quick check like the sketch after this list can help).
    It may well be the case that for high-priority problems there is insufficient data to hand, while another case can easily be analysed because relevant data is available. It really is a cost/benefit exercise.
  • Start small and iterate
    Go for ‘fast results’ that bring just a part of the value projected. In software engineering, this approach is called a ‘minimum viable result’ or ‘minimum viable product’. Then, in small steps, extend the analytics. Experience shows that this iterative approach is a source of ‘learning on the way’ – what works well, what does not? – allowing approaches to be changed in a timely manner.
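
As an illustration of the second point, evaluating whether a data set is sound enough for analysis can start with something as simple as the sketch below; the file and column names are hypothetical.

    # Evaluate whether a data set is sound enough for analysis: how complete
    # is each column, and what period does the data cover?
    import pandas as pd

    df = pd.read_csv("occupancy_history.csv", parse_dates=["timestamp"])

    completeness = df.notna().mean().round(2)  # share of non-missing values per column
    print("Completeness per column:")
    print(completeness)
    print("Period covered:", df["timestamp"].min(), "to", df["timestamp"].max())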

You don’t know what you don’t know: good analytics projects will reveal answers to questions that you have not even asked. If you have access to a lot of data, business analytics enables you to go in search of possible correlations between various factors that are available in your datasets.
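
A simple way to start that search is to compute a correlation matrix over the numeric columns of a dataset and inspect the strongest pairs, as in the sketch below. The file and its columns (for example occupancy, energy use, headcount) are hypothetical.

    # Search for possible correlations between factors in a dataset by ranking
    # column pairs by the absolute value of their correlation.
    import pandas as pd

    data = pd.read_csv("facility_metrics.csv")  # hypothetical: occupancy, energy use, headcount, ...

    corr = data.corr(numeric_only=True)
    pairs = corr.stack().rename("correlation").reset_index()
    pairs = pairs[pairs["level_0"] < pairs["level_1"]]  # keep each pair of columns once
    pairs = pairs.sort_values("correlation", key=abs, ascending=False)

    print(pairs.head(10))

A strong correlation found this way is a prompt for further investigation, not proof of a causal relationship.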

Data analytics refers to the process of collecting, examining and transforming big data to uncover the useful information that is ‘hidden’ in it. The goal is to draw conclusions and solve issues. However, analytics is not only a tool for analysing past events with data from the past; it can also be used to anticipate future patterns with today’s data. Gartner therefore separates analytics into four categories (the first and third are illustrated in the sketch after this list):

  • Descriptive analytics: describes what the current situation is
  • Diagnostic analytics: gives insight into why particular events and patterns occur
  • Predictive analytics: as described earlier, makes predictions about future behaviour based on past data
  • Prescriptive analytics: based on predictions and current data, advises which actions you should take
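
To make the distinction tangible, the sketch below applies the first and third category to the same series of weekly visitor counts: the descriptive step summarises what has happened, the predictive step extrapolates a simple trend. The numbers are invented purely for illustration.

    # Descriptive vs. predictive analytics on the same hypothetical data:
    # weekly visitor counts for a facility.
    import numpy as np

    weekly_visitors = np.array([310, 325, 298, 340, 355, 362, 370, 381])
    weeks = np.arange(len(weekly_visitors))

    # Descriptive: what has happened so far?
    print("Average visitors per week:", weekly_visitors.mean())

    # Predictive: fit a simple linear trend and extrapolate one week ahead.
    slope, intercept = np.polyfit(weeks, weekly_visitors, deg=1)
    print("Forecast for next week:", round(slope * len(weekly_visitors) + intercept))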

What are the differences?

In short, Big Data refers to large, unstructured and fairly complex data sets that require specific tools to handle. Analytics involves structuring that data so that it becomes valuable input to the process of analysis. The emergence of Big Data also led to a new role: the data analyst or data scientist, a specialist who knows where to look and how to derive value from Big Data sets.

Applying Big Data and Analytics

Although it is nice to have clear definitions of Big Data and analytics in mind, the next question that arises is how to apply these techniques within your organisation. How could your organisation benefit from the insights derived from analytics? What software or tools should you use in order to create value out of your data? These questions will be answered in an upcoming blog post. But the advice is: start small and manage complexity well – do not overcomplicate things at the start.

Would you like to know more about this topic, or are you interested in what we can offer in terms of products to ‘handle’ data? Download our white paper or explore the options within our new product ‘Planon Connect for Analytics’.
