According to Svetlana Sicular, Research Director at Gartner, Big Data is falling into the Trough of Disillusionment. In plain terms, people are starting to understand that big data analytics solutions are not going to bring about world peace, make everyone instant billionaires, and cure the common cold. Don’t Cry, though. This doesn’t mean the promise of big data is all Dust N’ Bones or Double Talkin’ Jive. What it really means is that we now have a chance to get past the hype and find the real kernel of truth underlying the bigger-than-life myth. Guns N’ Roses may not have known anything about big data, but they almost had the right idea.
This disillusionment is an opportunity for savvy companies to make a leap forward. If you think that the overwhelming media focus on new big data analytics methods and technologies is just a Bad Obsession, well You Ain’t the First. But the thing about the trough of disillusionment is that it leads eventually to the plateau of productivity, and that’s where businesses get the real benefit from new technology. When people begin to understand the limitations of new tech, that’s when they Get in the Ring and start really putting it to good use.
Neil Capel on Wired hit the nail on the head in his article: Smart Data: For Communications, Big Data Becomes Transformational.
To get a little more specific, when I talk about big data tech, I’m usually thinking of Hadoop. There are other options, of course: appliances, HPCC, even traditional databases scaled to the extreme. The biggest advantage of Hadoop over the other options is one that matters to every business: cost.
On the Harvard Business Review Blog Network, Paul Barth and Randy Bean produced an excellent article on the results of a survey by NewVantage Partners: Get the Maximum Value Out of Your Big Data Initiative. There’s a lot of excellent information and advice in the article, but the one thing that clearly shows why Hadoop is the dominant big data technology in practical applications is summed up in this graph:
Yes, Hadoop can process a lot of data for a reasonable price. The thing is, and prepare to be shocked: Hadoop is not a panacea. It can’t do absolutely everything really well. This is where the disillusionment starts to set in, because if you believe all the press, you’d think Hadoop could cure world hunger and wash your socks. Don’t get me wrong. There is huge potential in the Hadoop model, it’s just never going to live up to the kind of insane press it’s been getting. Nothing could.
The MapReduce programming framework is the source of some of Hadoop’s limitations. Vincent Granville recently wrote an excellent article on Analytic Bridge with some specifics on What MapReduce Can’t Do. Once YARN breaks the Hadoop operating environment free of MapReduce’s restrictions, I expect an explosion of applications and frameworks designed to run on that environment, each specialized to do one thing really well on Hadoop. Instead of trying to do everything with one tool, many tools will be created, each ideal for one specific aspect of big data processing.
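To make the restriction concrete, here is a toy, single-process sketch of the map→shuffle→reduce pattern that classic Hadoop imposed on every job. This is an illustration of the programming model, not Hadoop’s actual Java API; the function names are my own. The point is that every problem has to be contorted into “emit key/value pairs, group by key, aggregate” — which is exactly why iterative and interactive workloads strain the model, and why frameworks freed by YARN don’t have to work this way.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each group -- here, sum the counts per word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["welcome to the jungle", "the jungle is big"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # word counts across all documents, e.g. counts["the"] == 2
```

Word counting fits this mold perfectly, which is why it is the canonical example. Anything that needs to loop over the data repeatedly, or respond in milliseconds, does not.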
The real advantage of this new technology, aside from the fact that it can handle massive volumes of data without choking and dying, or killing your operating budget, is speed. We are not a patient society. The cutthroat world of business has become all about who can get the right answer faster.
If I can place a correctly targeted ad on a Web page two seconds before you can, chances are, I made the sale and you didn’t. If I can catch cybersecurity intrusions two seconds faster, I might save my company millions. If I can predict when a switch is overloaded two seconds before it fails and re-route, so that my calls drop less than my telecom competitors, I get thousands more loyal customers. Two seconds can be an eternity these days.
From that same HBR blog article by Paul Barth and Randy Bean that had the relative cost graph comes a quote that really brings the point home:
That last bit is the kicker. Welcome to My World. It’s pretty much what I was trying to say in my SmartDataCollective article on Predictive Analytic Strategies to Out-Predict the Competition. Big data technologies applied to normal analytics workloads become predictive analytics accelerators. Big data technologies applied to real modern business analytics needs give you the answers you need faster, no matter how big your data is.
But that’s all Hadoop and the other new big data technologies do.
They can’t stop North Korea from testing nukes. They can’t predict the lottery numbers for you. And they can’t get your teenage son to put the seat down on the toilet. Get past the disillusionment with what they can’t do, and use your disillusion to finally reach the potential of what big data technologies genuinely can do for your company. Or, you might just find your business Knockin’ on Heaven’s Door.