Agencies hot for big data, but plans, resources are lacking

By Rutrell Yasin

Jun 19, 2013

Big data analytics has the potential to save nearly $500 billion — or 14 percent of agency budgets — across the federal government, and 69 percent of federal IT executives in a new survey say deriving more value out of the troves of data will increase efficiency, foster smarter decisions and deepen insights.

However, just 31 percent say their agency has an adequate big data strategy, according to the survey of 150 federal IT executives by MeriTalk.

The findings come from the report “Smarter Uncle Sam: The Big Data Forecast,” which was underwritten by EMC Corp.

Big data is defined as a volume, velocity and variety of data that exceeds an organization’s storage or computing capacity for accurate and timely decision-making. Public-sector agencies worked for years on complex analytic projects in many domains before the term “big data” came along. What has changed, according to industry experts, is that the cost of computing has come down, unlocking capabilities for agencies to analyze and find hidden value in data.

As a result, agencies are putting in place the building blocks to tap into the hidden value of big data. For instance, nearly one-fourth of the federal IT executives have launched at least one big data initiative, investing in IT systems and solutions to improve data capture, processing and storage as well as identifying challenges that big data can solve, according to the report. Federal agencies are spending money on technology to:

Increase server storage capacity to house and analyze big data

Determine bandwidth needs for big data storage and analytics

Deploy advanced data mining practices

Sequestration budget cuts pose a significant risk to launching new big data programs, federal IT executives said. Forty-one percent are experiencing budget cuts of more than 10 percent as a result of sequestration. The executives identified several sequestration casualties, including: training and workforce development (51 percent), hardware upgrades (48 percent), software upgrades (41 percent) and new applications development (40 percent).

The findings align with other surveys focusing on the use of big data analytics in government and business. A TechAmerica Foundation study of nearly 200 public sector IT professionals released in February indicated that both federal and state IT officials think big data analytics can have real and immediate impacts on how governments operate, from helping to predict crime to cutting waste and fraud.

The survey, commissioned by SAP AG and conducted by pollsters Penn Schoen Berland, cites the potential of big data analytics to improve lives and save money. Beyond those projections, federal and state governments are already deriving value from specific projects.

For example, big data and text analytics are being applied to agency projects that deal with large amounts of unstructured data, such as NASA’s analysis of airline safety reports and a Homeland Security Department-funded bio-preparedness collective. Newly available data on lightning activity within clouds gives the National Weather Service, NASA and the military better warnings about severe weather.

Predictive analysis is growing as a crime-prevention tool, as city police departments such as those in Baltimore and Philadelphia use analytic tools to parse large volumes of data to forecast patterns and prevent crimes. Analytics is also fueling the Air Force’s efforts to improve patient care and support research into preventive medicine and disease management, making it easier for clinicians to comb through data to find meaningful insights.

Still, many organizations cannot gather business insights from big data fast enough to make informed decisions, according to a recent survey of 200 business professionals in large organizations conducted by IDG Research Services and Kapow Software. More than 85 percent of business and IT leaders agreed that big data offers substantial value in its ability to support more informed business decisions and foster a data-driven organization. However, more than half of the respondents (52 percent) rated big data project success so far as lukewarm (or somewhat successful), and only 23 percent perceived big data projects to be a success. The inability to automate structured and unstructured data quickly and effectively is among the biggest challenges, with 60 percent of respondents noting that big data projects typically take at least 18 months to complete.

Reader Comments

Fri, Jun 21, 2013
Mariela
United States

Very interesting, Rutrell!
The trick with big data is to start small. For businesses, it is crucial to identify key areas of concern and monitor them over time. Too much data can lead to uncertainty if what is being collected is not specific to your key concerns. Rarely do companies have 100 percent of the data they need to make decisions, and having too much data can stand in the way of making the right decision.
clearCi recently published a very insightful white paper titled “How to Maximize Awareness and Minimize Uncertainty” that tackles big data and information management.
You can access the white paper here: http://clearci.co/11D5dwo
