Big Data Adoption Too Slow in Public Sector, Govt Pays the Price

Storage solutions maker NetApp and MeriTalk, a community portal for government IT personnel, released a report about the big data gap in the public sector – how government organizations are seeing the same changes everyone else does, but are not as quick to adapt.

Over 150 CIOs and managers participated in the study, and provided some statistics that offer a deeper glimpse into the way big data is being addressed on the federal side of the aisle.

Only 60 percent of the respondents said that the agency they work for actively makes use of its data, while only 40 percent noted that the insights extracted are being used in the decision-making process. At the same time, 87 percent said that the amount of data they store has increased in the past couple of years, and 96 percent believe it will increase in the next two as well.

Approximately 31 percent of government agencies’ storage capacity is taken up by unstructured data, according to the joint study, and on average the respondents estimate it will take their organization three years to start making use of it all.

The paper drew several other conclusions – most notably, that the majority of IT professionals believe they would need infrastructure twice as powerful to handle their data analytics requirements. Still, the advantages of big data are certainly being recognized:

“According to the Big Data Gap report, Federal IT professionals say improving overall agency efficiency is the top advantage of big data (59 percent) followed by improving speed/accuracy of decisions (51 percent) and the ability to forecast (30 percent).”

NetApp and MeriTalk’s conclusions surface alongside a similar study of the private sector, in which one research firm found that analytics and cloud are becoming a top priority for midsized businesses.