Use your data to prove the value of corporate real estate and facility management
This 11-page e-book helps you decide what – and how – to measure, analyse and report on real estate and facility management data.
“Big Data” and “Analytics” are still two major topics when it comes to IT. At the same time, they are often misunderstood disciplines, starting with their definitions. Big Data is not just about large amounts of data, as you might expect; it is also about using a variety of data types and, in some cases, about the speed at which incoming data needs to be processed to achieve insights that matter.
The definition of ‘analytics’ is ambiguous today. With BI (Business Intelligence) slowly becoming an acronym of the past, it is fair to say that the term ‘analytics’ is used to encompass BI as well as new technologies that allow us to predict future events and ultimately prescribe responses to those events.
In our field, the combination of Big Data and analytics enables us to predict, each with a given probability, when a meeting room will be available, how many employees will come to your facility tomorrow and what they would prefer for lunch. In the field of asset management, predicting asset inefficiencies and failures can substantially reduce downtime and avoid unnecessary costs.
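As a hedged illustration of what such a prediction can look like in practice, the sketch below fits a simple logistic regression on synthetic booking history to estimate the probability that a meeting room is occupied at a given hour. The features, the made-up data and the choice of scikit-learn are assumptions for the example, not a description of any particular product.

```python
# Minimal sketch: estimate the probability that a meeting room is free at a
# given hour, trained on (synthetic) historical booking records.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Hypothetical history: hour of day (8-18) and weekday (0-4) per observation,
# with 1 = room was occupied, 0 = room was free.
hours = rng.integers(8, 19, size=1000)
weekdays = rng.integers(0, 5, size=1000)
occupied = (rng.random(1000) < 0.3 + 0.04 * (hours - 8)).astype(int)

X = np.column_stack([hours, weekdays])
model = LogisticRegression().fit(X, occupied)

# Probability that the room is free on a Tuesday at 10:00.
p_occupied = model.predict_proba([[10, 1]])[0, 1]
print(f"Chance the room is free: {1 - p_occupied:.0%}")
```

In a real setting the same approach would be trained on actual reservation and sensor data rather than generated records.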
But first, let's go back to basics: What is Big Data, and, above all, what isn’t it? What is analytics, and how do you apply it?
Gartner defines Big Data as:
“High-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.”
As expected, the term ‘big’ refers to large amounts of data (volume) according to this definition. Today we are storing almost incomprehensible amounts of data. To illustrate: by 2020, 40 zettabytes of data are expected to have been created, more than 300 times the amount created in 2006. However, volume is only one component (and one of the ‘V’s) of the ‘big’ in Big Data.
The other components Gartner refers to are Variety, the different manifestations or formats in which data is available, from video to static data, and Velocity, the tremendous speed at which data becomes available. A fourth ‘V’ that is often added is Veracity, which refers to the uncertainty of data: because there is so much of it, almost everything becomes statistically significant and errors are harder to identify. According to most scientists, however, this is not a problem, because you would still be able to draw representative conclusions from the data.
There are many ways to analyse data. Organisations are gathering huge volumes of data today, especially in cases where IoT technologies are installed. The number of potential investigations into that data is overwhelming.
So, the question is: what should you analyse? In fact, this question is fundamental not only for setting up an analysis approach, but also for the type of data to be acquired. If, for example, the capacity and occupancy of the available meeting rooms need to be analysed in an effort to predict future demand, you will need a summary of the number of meeting rooms, the reservations and (if available) the actual occupancy data over the past 15 years.
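For illustration, the sketch below shows one way such a summary could be built from a reservation log using pandas. The column names (room_id, start, hours_booked) and the tiny sample dataset are assumptions made for the example.

```python
# Minimal sketch: summarise meeting-room reservations into a monthly series
# that a demand forecast could later be trained on.
import pandas as pd

reservations = pd.DataFrame({
    "room_id": ["A1", "A1", "B2", "B2", "A1"],
    "start": pd.to_datetime(["2023-01-10 09:00", "2023-01-17 14:00",
                             "2023-02-02 10:00", "2023-02-20 11:00",
                             "2023-02-21 09:00"]),
    "hours_booked": [1.0, 2.0, 1.5, 1.0, 3.0],
})

# Reservations and booked hours per room per month: the kind of summary the
# text refers to, which could be extended with sensor-based occupancy data.
monthly = (reservations
           .groupby(["room_id", pd.Grouper(key="start", freq="MS")])["hours_booked"]
           .agg(["count", "sum"])
           .rename(columns={"count": "reservations", "sum": "booked_hours"}))
print(monthly)
```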
A sound approach to applying analytics typically includes a number of elements:
You don’t know what you don’t know: good analytics projects reveal answers to questions that you have not even asked. If you have access to a lot of data, business analytics enables you to go in search of possible correlations between the various factors available in your datasets, as the sketch below illustrates.
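As a minimal sketch of that kind of exploratory correlation search, the example below computes a plain correlation matrix over a few invented facility metrics; the columns (occupancy, energy use, helpdesk complaints) and the generated figures are purely illustrative.

```python
# Minimal sketch of "searching for correlations": a correlation matrix across
# facility metrics. All data here is synthetic; in practice it would come
# from your own datasets.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 365  # one year of daily observations

occupancy = rng.uniform(0.2, 0.9, n)                        # share of desks in use
energy_kwh = 800 + 1500 * occupancy + rng.normal(0, 50, n)  # daily energy use
complaints = rng.poisson(2 + 6 * occupancy)                 # helpdesk tickets per day

data = pd.DataFrame({"occupancy": occupancy,
                     "energy_kwh": energy_kwh,
                     "complaints": complaints})

# A quick scan of pairwise correlations can surface relationships you were
# not explicitly looking for (e.g. occupancy vs. helpdesk complaints).
print(data.corr().round(2))
```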
Data analytics refers to the process of collecting, examining, transforming and analysing big data to uncover the useful information that is ‘hidden’ in it. The goal is to draw conclusions and solve issues. However, analytics is not only a tool with which you can analyse an event from the past with data from the past; it also makes it possible to predict future patterns with today’s data. Gartner, therefore, separates analytics into four different categories: descriptive analytics (what happened?), diagnostic analytics (why did it happen?), predictive analytics (what will happen?) and prescriptive analytics (what should be done about it?).
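To make the difference between these categories tangible, the sketch below contrasts a descriptive summary of past occupancy with a naive predictive trend extrapolation on the same synthetic series; the numbers and the simple linear trend are assumptions for illustration only.

```python
# Minimal sketch contrasting descriptive and predictive analytics on the same
# (synthetic) monthly occupancy series.
import numpy as np

monthly_occupancy = np.array([0.61, 0.63, 0.66, 0.64, 0.68, 0.71])  # last 6 months

# Descriptive analytics: what happened?
print(f"Average occupancy over the period: {monthly_occupancy.mean():.0%}")

# Predictive analytics: what is likely to happen next month?
months = np.arange(len(monthly_occupancy))
slope, intercept = np.polyfit(months, monthly_occupancy, deg=1)
forecast = slope * len(monthly_occupancy) + intercept
print(f"Trend-based forecast for next month: {forecast:.0%}")
```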
In short, Big Data refers to unstructured and fairly complex data sets that require specific tools to handle. Analytics involves structuring that data so that it becomes valuable input for the process of analysis. The emergence of Big Data has also led to a new role: that of the data analyst or data scientist, a specialism focused on knowing where to look and how to derive value from Big Data sets.
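As a small, hypothetical example of that structuring step, the sketch below parses raw sensor events (an invented JSON format) into a tidy table that standard analysis tools can consume.

```python
# Minimal sketch: turn raw, semi-structured sensor events into a structured
# table. The event format and field names are invented for illustration.
import json
import pandas as pd

raw_events = [
    '{"sensor": "desk-101", "ts": "2024-03-04T09:12:00", "type": "presence", "value": 1}',
    '{"sensor": "desk-101", "ts": "2024-03-04T11:40:00", "type": "presence", "value": 0}',
    '{"sensor": "room-A1", "ts": "2024-03-04T09:05:00", "type": "co2", "value": 640}',
]

# Parse the JSON strings into one row per event, with typed columns.
events = pd.DataFrame([json.loads(e) for e in raw_events])
events["ts"] = pd.to_datetime(events["ts"])

# Pivot so each sensor type becomes a column: structured input for analysis.
tidy = events.pivot_table(index=["sensor", "ts"], columns="type", values="value")
print(tidy)
```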
Although it is nice to have clear definitions of Big Data and analytics in mind, the next question that arises is how to apply these techniques within your organisation. How could your organisation benefit from the insights derived from analytics? What software or tools should you use in order to create value out of your data? These questions will be answered in an upcoming blog post. But the advice is: start small and manage complexity well – do not overcomplicate things at the start.
Would you like to know more about this topic, or are you interested in what we can offer in terms of products to ‘handle’ data? Download our White paper or explore the options within our new product, ‘Planon Connect for Analytics’.