While a bunch of Actian folks were at HIMSS and TDWI Las Vegas this week, I was in DC for the IDGA Big Data in Defense and Government Summit. It was attended by some very interesting people, and I personally learned a lot, both about the unique challenges folks in the government are facing and about how my own company is helping them. For instance, I didn’t know until I talked to the two sales reps for this area that Actian is part of the DISA (Defense Information Systems Agency) infrastructure.
As a general rule at conferences, the first thing I do is ask folks what sort of work they’re doing and what problems they’re running into. I’ve been doing that for years with folks in a variety of industries, and without fail, I always learn something new and useful about what people in the trenches are up against. One difference I noticed at a conference of high-ranking government folks is that they tend to be a lot more close-mouthed about the work they’re doing than folks doing similar things in the corporate world. But despite that, data geeks and techies speak the same language regardless of industry, and everyone likes to gripe about the crazy stuff they have to deal with. If I had to pinpoint one consistent challenge faced by everyone I talked to, across all the different agencies and departments, it would be the need to do more with less.
“Unfunded mandates” are the bane of government IT groups across the board. Government agencies face the same explosions in sheer data volume, in needing to analyze new data sources, and in dealing with floods of sensor data coming at them fast. So, they have to handle some big data problems just to continue doing what they’ve been doing. But they also have to do what the President and Congress tell them needs doing. New laws pass down changes in rules and improvements in service that agencies are required to implement, but they often don’t pass down any corresponding increase in agency budgets.
As a result, government agencies are looking for ways to accomplish more without spending more. With the big strides forward in open source technology in recent years, Hadoop analytics is becoming a more viable option. As little as two years ago, Hadoop was completely out of the question; now, advances in proprietary ecosystem products, such as security layers like Zettaset and accelerators like Actian, are making Hadoop a part of many new and developing government architectures. Open source projects like MongoDB are also finding their way in, making life easier for government data wranglers.
It feels a little like the early days of cloud. At some point, the US government went from being completely against anything cloudy to hosting its own large-scale cloud infrastructure at DISA. A similar sea change in government circles is happening with Hadoop and much of the open source world. The need for affordable data crunching capability is Too Big To Ignore (one of the Army folks showed off Phil Simon’s book as an example of what they’re dealing with).
US government projects (and those of other governments as well; I chatted with one gentleman from the Swedish government) tend to move at a much slower, more careful pace than corporate projects. This means that cutting-edge technology advancements are often not on their radar. That didn’t surprise me. What did surprise me was when someone mentioned that they can, to a certain extent, look at product roadmaps and not worry about specific current capabilities. As long as the software infrastructure capabilities will be there in a year or two, they can keep moving forward with the project. That was an interesting revelation to me. In order to plan a project that long term, government CIOs, CTOs, and architects have to plan way ahead. In a way, they have the polar opposite problem of companies where new projects have to make an immediate impact on profit for the next quarter in order to get approval.
While talking to the Army Lt. Colonel, I realized one other aspect that is very much like what companies face. He had an old-school, large-enterprise-style data warehouse with standard ETL and data management and Oracle BI on top of it, which meant his data was often three months old by the time he got to analyze it. And he could only do backward-looking Business Intelligence, no forward-looking predictive analytics. He also had problems with slow responses to new requests for information, and with different visualization needs from his various data consumers. His problem might not be considered a “big data” problem, but it comes down to price/performance and speed to answers. If the data is bogging down the system, he could just get a bigger Oracle instance, but that’s neither fiscally practical, nor will it solve the need for predictive analytics and custom visualization. It won’t put the power to get answers in the hands of the people asking the questions. The Army is facing the same walls that companies all over the world are banging against. The Army is just more used to being able to knock those walls down than most companies are.
My overall takeaway is that government circles need big data analytics software that is proven, affordable, very fast, and easy to use, with machine learning and predictive algorithms built in, capable of handling all kinds of data (including streaming data and low-latency interactive queries), and constantly evolving and improving. That part isn’t much of a surprise.