Entering the Age of Data and the challenges this brings


Recently, I was invited to sit on a panel at a big data event and take part in a healthy debate on the subject of data.  The event brought together a selection of entrepreneurs, investors and companies who were all eager to know how harnessing, managing and analyzing ever-increasing amounts of data can give them a competitive edge.  Here is a summary of the questions asked and answers given.


Question: How are organizations’ analytic needs going to change as we enter the Age of Data?

Data is generated by more devices than ever before and flows constantly through more systems every single day.  In the last two years we have generated more data than in all of prior human history.  And yet even that pales into insignificance when you consider that data is expected to double in size every two years through 2020, exceeding 40 zettabytes (40 trillion gigabytes) — the equivalent of 5,200GB of data for every man, woman and child on Earth.

Why and how is this important? In the Age of Data, our ability to connect emerging data, analyze it at scale, and take action in real-time means the difference between winning and losing.  For companies to guarantee future growth and success, it is imperative that they implement a technology approach today that will help them gain and sustain a competitive advantage over time.

Question: What do you see as the main challenges for data analytics in the Age of Data?

Analytics will need to respond to the sheer amount of data that is being created every second.  Everything around us is getting smarter and generating data, whether it be our smartphones, our cars, our refrigerators or even our office photocopiers.  They are all gathering information and transmitting it so that it can be used.

The main challenges can be summed up in two main points.  Can companies connect their data and can they analyze it at scale?  First, connecting the data.  With companies moving more and more to the cloud, they need to find a simple way of making their data volumes “talk” to each other, connecting systems that are on-premise as well as those that are in the cloud.  Second, analyzing it at scale.  Once the data is joined up, they will need the ability to cope with analyzing more and more of it.  Can their analytic platforms scale up and scale out?  The ability to have systems grow elastically will be as imperative as the ability to run fast analytic queries in just seconds.

However, there are two further challenges: the democratization of analytics, and getting to the pot of gold of predicting the future.  No longer will it be a case of the data scientist in the basement running analytic algorithms.  Now, companies want to empower front-line business users to pull in data, connect it together, analyze it and act upon it.

Furthermore, the Danish philosopher Kierkegaard once said that “life can only be understood by looking backwards, yet it must be lived forwards.”  This rings true for organizations that no longer want merely to report on what happened yesterday, but to get to a point where they can foresee what will happen tomorrow and take the appropriate action ahead of time.

Question: Do organizations have to throw away all their existing analytics solutions in order to work with Big Data?

While the naïve vendors would of course endorse this notion, the smart ones say just the opposite.  No, organizations will not, on a whim, de-install existing analytic technologies in which they have invested years of effort.  What matters more is that they continue to have the right tools and software to meet their business goals.  If the goal is to run high-performance, fast analytics, and their current setup does not allow this, then using analytic platforms as analytic offloads is one suitable approach.  In this model, organizations continue to use their operational systems for transaction processing, but add a complementary analytic platform to run their intensive data analytics workloads.
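To make the offload pattern concrete, here is a minimal sketch in Python.  It is purely illustrative — SQLite stands in for both the operational system and the analytic platform, and the `orders` table and region names are hypothetical — but it shows the shape of the approach: transactions land in one store, data is replicated to a second store, and the heavy aggregate queries run only against the replica.

```python
import sqlite3

# Hypothetical stand-ins: in practice the operational store would be an
# OLTP database and the analytic store a dedicated analytic platform.
operational = sqlite3.connect(":memory:")
analytic = sqlite3.connect(":memory:")

# Transactions are written to the operational system as usual.
operational.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
operational.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EMEA", 120.0), (2, "APAC", 80.0), (3, "EMEA", 200.0)],
)

# Offload step: replicate the operational data into the analytic store,
# so intensive queries never compete with transactional workloads.
analytic.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
rows = operational.execute("SELECT id, region, amount FROM orders").fetchall()
analytic.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# The analytic workload runs entirely against the replica.
totals = dict(
    analytic.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
)
print(totals)  # {'APAC': 80.0, 'EMEA': 320.0}
```

The key design point is the separation: the replication step can run on whatever schedule the business needs, and the operational system never sees an analytic query.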

Question: What are the important factors an organization should consider when implementing a Big Data Analytics solution?

Data has become a key corporate asset, and there are three main things companies should look for when considering big data analytics solutions.  First, can the solution help them connect various data sources together, integrate and prepare them, and check them for data quality and cleanliness?  Second, can it let them analyze ever-increasing volumes of data at fast speeds, easily?  Third, and most important, will it help them act on the insights and intelligence gleaned?  Will it bring them ever closer to the goal of predictive analytics, where algorithms and machine-based thinking help humans make the right decisions?  Ultimately, will the solution truly let them turn big data into business value?

About Sean Jackson

Sean Jackson is the marketing director for EMEA and APAC at Actian. Sean has an insatiable appetite to travel, ski, improve his photography skills, write, become a master of Adobe Creative Suite, play the piano, speak a variety of languages, and discover more about the world of big data analytics and Hadoop.
