
What is the Difference? Big Data vs. Analytics

“Big Data” and “Analytics” are still two major topics in IT. At the same time, they are often misunderstood disciplines, starting with their definitions. Big Data is not just about large amounts of data; it is also about the variety of data and the speed at which data is generated. Big Data is used to achieve insights that matter.

The definition of “analytics” is evolving. With BI (Business Intelligence) slowly becoming an acronym of the past, it is fair to say that the term “analytics” is now used to encompass BI as well as new technologies that allow us to predict future events and ultimately prescribe responses to those events.

In our field, the combination of Big Data and analytics enables us to predict when a meeting room is available (with probability), how many employees are coming to your facility tomorrow (with probability), and what their preference for lunch will be (again, with probability). In the field of asset management, predictions of asset inefficiencies and failure can substantially reduce downtime and avoid unnecessary costs.

First, let's go back to basics. What is Big Data, and above all, what isn’t it? Also, how do we define analytics and how do we apply it?

Back to Basics

Definition of Big Data according to Gartner:

“High-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.”

As expected, the word “big” refers to large amounts of data (volume) according to this definition by Gartner. Today we are storing incomprehensible amounts of data. To illustrate this: by 2020, it is expected that 40 zettabytes of data will be created, which is more than 300 times the amount of data created in 2006. However, volume is only one component (and one of the “V”s) of Big Data.

The other components Gartner refers to are Variety, the different manifestations or formats in which data is available (from video to static data), and Velocity, the tremendous speed at which data becomes available. A fourth “V” that is often added is Veracity, which refers to the uncertainty of data: with so much data, everything appears significant and errors become harder to identify. However, according to most data scientists, this is usually not a problem, because representative conclusions can still be drawn from the data.

Analyzing Data: What to look at?

There are many ways to analyze data. Organizations are gathering huge volumes of data today, especially in cases where IoT technologies are installed. The number of potential investigations into that data is overwhelming.

Now we can ask: what should we analyze? In fact, this question is fundamental not only for setting up analysis approaches, but also for deciding which data to acquire. When the capacity and occupancy of available meeting rooms need to be analyzed to predict future demand, one will need a summary of the number of meeting rooms, reservations, and (if available) actual occupancy data over, for example, the past 15 years.
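
As a minimal sketch, assuming reservation records that contain a room, a date, and the booked hours (the field names, figures, and the bookable-hours constant below are illustrative, not taken from any specific system), such a summary could be produced as follows:

```python
# Minimal sketch: summarizing meeting room reservations to estimate demand.
# The record structure, field names, and figures are illustrative assumptions.
from collections import defaultdict
from datetime import date

reservations = [
    {"room": "A-101", "date": date(2023, 5, 2), "hours": 3.0},
    {"room": "A-101", "date": date(2023, 5, 3), "hours": 6.5},
    {"room": "B-202", "date": date(2023, 5, 2), "hours": 2.0},
]

BOOKABLE_HOURS_PER_DAY = 10.0  # assumed bookable hours per room per day

# Aggregate booked hours per room and day, then express them as occupancy rates.
booked_hours = defaultdict(float)
for r in reservations:
    booked_hours[(r["room"], r["date"])] += r["hours"]

for (room, day), hours in sorted(booked_hours.items()):
    print(f"{room} on {day}: {hours / BOOKABLE_HOURS_PER_DAY:.0%} occupied")
```

Summaries like this, built over a long enough history, are the raw material for the demand predictions discussed above.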

A sound approach to applying analytics includes several elements:

  • Start with prioritizing “business problems”
    Any (new) analytics activity involves investments, such as time and resources. Make sure the investment is put into the issues that will return tangible value.
  • Evaluate against data available
    Check whether sound datasets are available for analysis of the business problems in question.
    It may be that for high priority problems, there is insufficient data at hand while another case can easily be analyzed because relevant data is available. It really is a cost/benefit exercise.
  • Start small and iterate
    Go for fast results that deliver just part of the projected value. In software engineering this approach is known as working towards a “minimum viable product.” Then, in small steps, extend the analytics. Experience shows that this iterative approach is a source of “learning on the go” (what works well, what does not?), allowing approaches to be adjusted in a timely manner.

You don’t know what you don’t know. Good analytics projects will reveal answers to questions you have not even asked. If you have access to a lot of data, business analytics enables you to search for possible correlations between the various factors available in your datasets.
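
As a small, hedged example, the sketch below checks for a linear relationship between two such factors; the figures (outside temperature versus daily office attendance) are purely illustrative:

```python
# Minimal sketch: checking for a linear relationship between two factors.
# The figures (outside temperature vs. office attendance) are purely illustrative.
from statistics import correlation  # Pearson correlation coefficient, Python 3.10+

temperature_c = [14, 16, 19, 22, 25, 28, 30]
attendance = [430, 425, 410, 380, 350, 330, 310]

r = correlation(temperature_c, attendance)
print(f"Pearson correlation: {r:.2f}")  # values near -1 or +1 suggest a strong linear relationship
```

A strong correlation found this way is only a starting point; it still needs to be validated against domain knowledge before acting on it.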

Data analytics refers to the process of collecting, examining, and transforming big data into useful information. The goal is to draw conclusions and solve issues. However, analytics is not only a tool for analyzing events from the past; it can also be used to describe future patterns based on today’s data. Gartner, therefore, separates analytics into four categories:

  • Descriptive analytics: describes what the current situation is
  • Diagnostic analytics: provides insight into why particular events and patterns occur
  • Predictive analytics: as described earlier, makes predictions about future behavior based on past data (see the sketch after this list)
  • Prescriptive analytics: based on those predictions and current data, advises which actions you should take
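
A minimal sketch, contrasting descriptive and predictive analytics (with a greatly simplified prescriptive step) on the same purely illustrative monthly occupancy figures; the linear-trend forecast is an assumption for demonstration, not a prescribed method:

```python
# Minimal sketch: descriptive vs. predictive (and a naive prescriptive step)
# on illustrative monthly occupancy figures; not a prescribed forecasting method.
from statistics import mean, linear_regression  # Python 3.10+

months = [1, 2, 3, 4, 5, 6]
occupancy_pct = [52, 55, 57, 61, 63, 66]

# Descriptive: what has occupancy looked like so far?
print(f"Average occupancy to date: {mean(occupancy_pct):.1f}%")

# Predictive: extrapolate the linear trend to estimate next month's occupancy.
slope, intercept = linear_regression(months, occupancy_pct)
forecast = slope * 7 + intercept
print(f"Projected occupancy for month 7: {forecast:.1f}%")

# Prescriptive (greatly simplified): recommend an action based on the projection.
if forecast > 80:
    print("Recommendation: add meeting rooms or stagger bookings.")
```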

What are the differences?

In short, Big Data refers to unstructured and complex data sets that require specific tools to handle. Analytics involves structuring that data to better understand its value.
The emergence of Big Data has led to new roles such as data analyst and data scientist. Data analysis is a strategic process that focuses on knowing where to look and how to derive value from Big Data.

Applying Big Data and Analytics

Although it is nice to have clear definitions of Big Data and Analytics in mind, the next question that arises is how to apply these techniques within your organization. How could your organization benefit from the insights derived from analytics? What software or tools should you use in order to create value out of your data? These questions will be answered in an upcoming blog post. My advice would be to start small and manage complexity well – do not overcomplicate things when you first introduce these techniques.

Would you like to know more about this topic or are you interested in what we can offer in terms of data products? Download our White Paper or explore our new product “Planon Connect for Analytics”.


Erik Jaspers

Global Product Strategy Director

For the past 24 years, Erik Jaspers has worked for Planon in several leadership positions focused on the development of Planon’s software solutions. Erik is a member of the IFMA EMEA Board and is an IFMA Fellow.
