How Big is Your Data? A Big Data Primer for Busy Execs


Decision makers need to know their big data basics. It’s the first step to understanding if big data will be a successful investment or a waste of money.

The job of business decision makers has never been easy. With the rise of big data, it has become even tougher: it's a challenge to decide whether big data is a must-have for a success-oriented company or just a waste of money.

We have already written about what big data is. Here, we will recap its key features and focus on business-related issues like practical application examples, costs, and ROI on big data projects.

Remember That Big Data Isn’t Just Big

There is no universally recognized definition of big data. Some settle for "big data is big," leaving the limits of "big" vague. Others explain the term through the "3 Vs" – volume, variety, and velocity. We will define big data through its key features, both informational and technical.

For the informational aspect, we can compare big data to a log of events. Consider some actions of an online store visitor: they added a product to their cart, removed it, redeemed a promo coupon, left a comment about product X, clicked an ad, etc. Each of these actions is an event that happened and cannot be changed.

See also: 10 big data facts you need to know

Big data can be incomplete. Yet 100% accuracy is not required to show a general picture or identify a trend. If an e-commerce retailer wants to learn about their customers' behavior, they needn't worry if they fail to recognize which page a visitor landed on before finding the product they liked. Knowing the landing pages for most visitors gives the retailer a fairly clear picture.

On the technical side, big data's volume is constantly increasing. With this challenge in mind, a big data solution requires a special approach to data storage and processing.

That is why the following technical features are critical:

  • Distributed data storage, with data stored on multiple standard computers, instead of one super-powerful and therefore extremely expensive machine.
  • High scalability or the ease of adding extra computers to store and process data if its volume increases.
  • Parallel data processing, which divides tasks among multiple processors so that they run in less time.

Some Big Data Use Cases

Big data is diverse — it can be either a part of traditional reporting or an autonomous system, like a recommendation engine. It can rely on either internal or external data sources or a combination of both. Let’s take a look at some different scenarios.

Big data as a part of reporting

Companies always strive to understand their customers better. Big data opens new possibilities in achieving this goal by allowing the analysis of omnichannel customer behavior, sentiments, feedback and more. These findings result in customer segmentation, which is then used in traditional BI. Customer segments turn into one of the parameters to look at while doing sales and marketing analytics.
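As a deliberately simplified illustration of the segmentation step (the customers, spend figures, and thresholds below are all invented), a sketch in Python might bucket customers by total spend:

```python
# Toy segmentation: bucket customers by total spend (figures invented).
customers = {"alice": 1200.0, "bob": 80.0, "carol": 430.0}

def segment(spend):
    # Thresholds are arbitrary; a real system would derive them from data.
    if spend >= 1000:
        return "high-value"
    if spend >= 250:
        return "mid-value"
    return "low-value"

# The resulting labels become a new dimension for BI reports.
segments = {name: segment(spend) for name, spend in customers.items()}
print(segments)  # {'alice': 'high-value', 'bob': 'low-value', 'carol': 'mid-value'}
```

In practice, segmentation draws on many behavioral signals at once, but the output is the same kind of label that traditional BI tools can slice and filter on.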

Big data as an autonomous system

A recommendation engine is an example of big data as an independent system – one that suggests products to customers on an e-commerce website, or one that tells a user which other social media users they may know. Human involvement is not needed, as the system analyzes a plethora of data about each user and recognizes behavior patterns.
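Many recommendation engines start from co-occurrence: customers with overlapping purchase histories are likely to share tastes. A toy Python sketch of that idea (the purchase data is invented, and real engines use far more sophisticated models):

```python
from collections import Counter

# Toy purchase history: which products each customer bought (invented data).
purchases = {
    "alice": {"laptop", "mouse"},
    "bob": {"laptop", "mouse", "keyboard"},
    "carol": {"laptop", "keyboard"},
}

def recommend(customer):
    # Score products bought by customers whose baskets overlap with ours.
    own = purchases[customer]
    scores = Counter()
    for other, basket in purchases.items():
        if other != customer and own & basket:
            scores.update(basket - own)
    # Most frequently co-occurring products first.
    return [item for item, _ in scores.most_common()]

print(recommend("alice"))  # ['keyboard']
```

The whole loop runs without human input: new purchases flow in, scores shift, and recommendations update accordingly.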

See also: 12 big data careers worth exploring

Big data from internal data sources

Manufacturers widely use sensors to improve asset maintenance. Usually, multiple sensors are installed on different machinery parts. They scan the state of the equipment and supply the analytical system with real-time observations. Over time, the system learns patterns that may signify a future machinery failure and alerts the maintenance team when it identifies one.
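One common way such a system flags a likely failure is to compare each new reading against a rolling baseline. A simplified Python sketch (the sensor values, window size, and threshold are invented for illustration):

```python
from statistics import mean, stdev

def detect_anomaly(readings, window=5, threshold=3.0):
    """Flag readings deviating more than `threshold` standard deviations
    from the rolling mean of the preceding `window` readings."""
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(readings[i] - mu) > threshold * sigma:
            alerts.append(i)  # index of the suspicious reading
    return alerts

# Steady vibration readings with one sudden spike at index 8 (invented data).
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 5.0, 1.0]
print(detect_anomaly(vibration))  # [8]
```

Production systems learn richer failure signatures across many sensors at once, but the principle – learn what "normal" looks like, then alert on deviations – is the same.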

Big data from external data sources

Companies can also turn to external big data sources. For example, Nestle relies heavily on social media to monitor, in real time, their customers' sentiment for each of their 8,000 brands. The team runs a non-stop analysis of every comment and post that mentions a Nestle brand. They don't just observe: thanks to real-time visualization tools, the team can easily spot when their involvement is needed.
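Real sentiment monitoring relies on far more sophisticated natural language processing, but a keyword-based toy sketch in Python conveys the basic idea (the word lists and posts are invented):

```python
# Toy keyword-based sentiment tagging for brand mentions (invented word lists).
NEGATIVE = {"awful", "broken", "recall", "disappointed"}
POSITIVE = {"love", "great", "delicious", "recommend"}

def tag_sentiment(post):
    words = set(post.lower().split())
    if words & NEGATIVE:
        return "negative"  # candidates for escalation to the brand team
    if words & POSITIVE:
        return "positive"
    return "neutral"

posts = [
    "I love this chocolate",
    "Disappointed with the new recipe",
    "Just bought a coffee",
]
print([tag_sentiment(p) for p in posts])  # ['positive', 'negative', 'neutral']
```

Feeding such labels into a real-time dashboard is what lets a team spot a brewing negative trend and decide when to step in.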

What Does a Big Data Project Cost…and Save?

Understanding that pricing is an important issue, we’ve included this section in the description of big data basics. An indicative cost of a big data project is $200,000 – $600,000, though an actual price strongly depends on business tasks and, consequently, on the scope of the project.

It’s uncommon for businesses to disclose their ROI figures. Even when public companies do, they tend to report a total figure rather than the ROI of each project separately. That is why we will share publicly available findings only. The Big Data Executive Survey 2017 showed that 80.7% of respondents considered their big data investment successful (alas, they did not reveal exact figures), while 21% perceived big data as disruptive or transformational.

Despite this positive picture, business decision makers should remember that a company can only understand the required investment and expected benefits after it has explained its short- and long-term requirements to the big data consulting team and has had its existing systems audited.

We have described the key features of big data – both informational and technical. Looking at sample big data applications involving internal and external data sources, we have shown how big data can contribute to a business’ success. We have also touched on the financial side, sharing an indicative project price together with our considerations about ROI. With this information at their disposal, we hope readers will be able to decide whether big data will be a successful investment for them.


About Alexander Bekker

Alexander is Head of the Database and BI Department at ScienceSoft. With 18 years of experience, Alexander focuses on BI solutions (data-driven applications, data warehouses and ETL implementation, data analysis, and data mining) in the retail, healthcare, finance, and energy industries. He has led large projects such as private-label product analysis for 18,500+ manufacturers, a global analytical system for luxury vehicle dealers, and more.
