Top 5 Big data trends revisited

Hitachi Data Systems Corporation (HDS). (Image source: Google/Hitachi Data Systems Corporation (HDS))

As we head deeper into the second half of 2015, Wayne Dick, Pre-Sales Manager at Hitachi Data Systems, Sub-Saharan Africa, revisits this year’s big data trends and considers how a streamlined data refinery architecture can help businesses to better process their information.

The volume, velocity and variety of data have exploded in recent years. Research suggests that more than 20% of organisations use at least 20 data sources, both structured and unstructured, yet putting this data to business use is still challenging.

Trends revisited
A few months ago, Hitachi Data Systems announced that it had acquired Pentaho, a leading data integration, visualisation and analytics company.

At the beginning of the year, Pentaho outlined five emerging big data trends that were helping businesses to use increasingly large and varied data sets to deliver high and sustainable return on investment (ROI). It’s safe to say that we are no longer in the emerging stages, with most of these developments taking hold in organisations.

1. Big data meets the big blender
The use cases that deliver the highest ROI require the blending of unstructured data with more traditional relational data.

For example, to get a 360-degree view of customers, the de facto reference architecture blends relational/transactional data detailing what the customer bought with unstructured weblog and clickstream data highlighting what they might buy in future, and with social media data describing customer demographics and sentiment around the company's products.

This ‘big blend’ is fed into recommendation platforms to drive higher conversion rates, increase sales and improve customer engagement.
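In miniature, the 'big blend' amounts to merging records from several sources into one view per customer. The sketch below illustrates the idea in plain Python; all field names and records are invented for the example and are not part of any HDS or Pentaho reference architecture.

```python
# Illustrative data sources (all records and field names are invented)
transactions = {1: {"last_purchase": "laptop"},
                2: {"last_purchase": "phone"},
                3: {"last_purchase": "tablet"}}

clickstream = {1: {"pages_viewed": 12, "top_category": "accessories"},
               2: {"pages_viewed": 5, "top_category": "phones"}}

sentiment = {1: {"sentiment": 0.8},
             3: {"sentiment": -0.2}}

def blend(*sources):
    """Merge every source into a single 360-degree record per customer."""
    customers = {}
    for source in sources:
        for customer_id, fields in source.items():
            customers.setdefault(customer_id, {}).update(fields)
    return customers

view = blend(transactions, clickstream, sentiment)
print(view[1])
```

In a production setting the same join would run over far larger, messier data sets in a data integration platform, but the shape of the operation is the same: each source contributes its fields to one unified customer record.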

2. Internet of Things will fuel the new ‘industrial Internet’
This year, we have seen big data starting to transform industrial businesses by delivering operational, strategic and competitive advantage.

Smart factories, for example, are embracing the Internet of Things (IoT) and the so-called Industry 4.0 to become more flexible, resource efficient, ergonomic and integrated with customers and business partners. The machine data generated by devices and sensors in these smart settings sits at the heart of the IoT and requires big data analytics to unlock key opportunities.
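One of the simplest forms of analytics on such machine data is anomaly detection over a sensor stream. The toy sketch below flags readings that sit far from the mean; the readings and the two-standard-deviation threshold are invented for illustration, not drawn from any real factory deployment.

```python
from statistics import mean, stdev

# Illustrative temperature readings from a factory sensor (invented values)
readings = [70.1, 69.8, 70.3, 70.0, 69.9, 84.6, 70.2]

mu, sigma = mean(readings), stdev(readings)

# Flag readings more than two standard deviations from the mean
anomalies = [r for r in readings if abs(r - mu) > 2 * sigma]
print(anomalies)
```

Real industrial analytics operates on continuous, high-volume streams and uses far more robust models, but the principle is the same: turn raw sensor output into actionable signals.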

3. Big data gets cloudy
As companies with huge data volumes seek to operate in more elastic environments, many are exploring how their big data requirements can benefit from cloud infrastructures. This suggests that the cloud is now “IT approved” as a safe, secure and flexible data host.

4. Embedded analytics is the new business intelligence
Business users are increasingly consuming analytics embedded within applications to drive faster, smarter decisions. Gartner research estimates that more than half of enterprises that use business intelligence (BI) now use embedded analytics.

5. Data refineries give real power to the people
In the face of exploding volumes of structured transaction, customer and other data, traditional extract, transform, load (ETL) systems slow down, making analytics unworkable.

While these processes result in safe, clean and trustworthy data, they need to be made fast, easy, highly scalable, cloud-friendly and accessible to the business. This is where streamlined data refinery comes into play.

Streamlined data refinery
When it comes to extracting value from big data, basic batch reporting doesn’t cut it anymore. Users want to explore analytics on demand, in their preferred format, in the context of other software applications they use every day.

While data visualisation tools have helped line-of-business groups to help themselves, only some of the demands have been met. Users either have access to only a subset of data that they don’t entirely trust, or they simply cannot access the data they want, when they want it.

Streamlined data refinery is a first step towards governed data delivery: the delivery of blended, trusted and timely data that powers analytics at scale, regardless of data source, environment or user role.

Streamlined data refinery enables highly regulated industries like financial services, healthcare and energy to prove they comply with government regulations. This often requires them to combine data from multiple sources, run statistics and prove that their data management practices meet specific standards.

By having access to complex, trusted and compliant data sets on demand, in a timely fashion, in a way that is blended and easily consumable, organisations will be better positioned to respond to changing conditions quickly, while IT saves time through automation.
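The refinery pattern described above, ingest raw feeds once, clean and blend them, then serve trusted data on demand, can be sketched in a few lines. The feed names, formats and cleaning rules below are assumptions made for the example only.

```python
from datetime import date

# Two illustrative raw feeds with inconsistent formats (invented data)
crm_feed = [{"id": "001", "name": " Alice ", "joined": "2015-03-01"}]
web_feed = [{"id": "001", "visits": 42}, {"id": "002", "visits": 7}]

def refine(crm, web):
    """Transform and blend raw feeds into one clean, queryable store."""
    store = {}
    for row in crm:
        store[row["id"]] = {
            "name": row["name"].strip(),                  # standardise text
            "joined": date.fromisoformat(row["joined"]),  # parse dates
            "visits": 0,
        }
    for row in web:
        store.setdefault(row["id"], {"name": None, "joined": None, "visits": 0})
        store[row["id"]]["visits"] = row["visits"]
    return store

refined = refine(crm_feed, web_feed)

# Business users then query the refined store on demand
print(refined["001"]["name"], refined["001"]["visits"])
```

A production refinery would automate these transform-and-blend steps across many sources at scale, which is where the IT time savings through automation come from.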

Staff Writer