Analytics guru Thomas Davenport says the field of strategic analytics has advanced to "3.0" with a recent shift in the use of analytic methods from improving internal operational efficiency to improving the products and services themselves.

Thomas Davenport, in a recent Harvard Business Review article, says "Some of us now perceive another shift, fundamental and far-reaching enough that we can fairly call it Analytics 3.0." What does this mean for leaders of large organizations?

The Three Phases of Analytics

Davenport writes that the field of analytics has evolved during the past 60 years in three phases:

Analytics 1.0 was born in the mid-1950s and was referred to as “business intelligence.” It gave managers “the fact-based comprehension to go beyond intuition when making decisions,” Davenport says. It involved examining data from production processes and customer interactions. He notes that new computing technologies were key and were often custom-built.

Analytics 2.0 emerged in the mid-2000s when Internet and social media companies -- Amazon, Google, eBay, etc. -- "began to amass and analyze new kinds of information," he says. This was referred to as "big data." Davenport says big data differs from small data because it comes from sources outside the company and is not generated solely by the organization's own internal systems. It comes from sensors and social media as well as video and audio recordings. It often cannot be stored on a single server; much of it has to be stored in the cloud.

Analytics 3.0 involves collecting data on every activity associated with your products, services and customers, because “every device, shipment and consumer leaves a trail,” he says. This is not done just by information companies but by every organization. Davenport says organizations “have the ability to embed analytics and optimization into every business decision made at the front lines of your operations.”

In the Private Sector

Davenport offers several examples of large companies that he believes have made the leap to Analytics 3.0:

General Electric not only builds engines and medical devices, it embeds sensors in them so users can optimize their capabilities. Davenport says: “With sensors streaming data from turbines, locomotives, jet engines and medical-imaging devices, GE can determine the most efficient and effective service intervals for those machines.”

UPS has installed telemetric sensors in its fleet of 46,000 delivery trucks that track speed, direction, braking and drivetrain performance. Using these data, UPS has redesigned drivers' routes to cut 85 million miles from their travel distance, saving both time and fuel costs.

Schneider Electric, a French energy management firm, handles energy distribution for utility companies. Its devices allow utilities to "integrate millions of data points on network performance" and let engineers use visual analytics to understand the state of the network, Davenport says.

In the Public Sector

As with most trends, government often lags behind. While Davenport's article focuses on the impact of Analytics 3.0 in the private sector, he has written previously about the strategic use of analytics in government, and it is not hard to see Analytics 3.0 carrying over to the public sector. For example:

City and regional transportation departments are creating intelligent transportation systems that use remote cameras and cellphone data to determine traffic conditions and adjust the timing of traffic lights and pre-position emergency equipment.

Public hospitals and the Veterans Health Administration are using mobile devices to allow doctors and patients to update and share health data in real time, so treatment decisions can be made in minutes instead of days. They are also embedding sensors in patient protocols and hospital supplies to track their use and replacement.

The Homeland Security Department is planning its next steps to help states and localities to employ real-time continuous monitoring of computer networks to detect and deter cybersecurity events.

Next Steps

Davenport offers a series of steps managers should take to refocus their organizations in order to take advantage of Analytics 3.0. These include actions such as designating chief analytics officers and developing new methods of decision-making and managing. Many of his steps are also reflected in two recent reports on building an analytics culture and lessons from early programs by the IBM Center for the Business of Government. Davenport concludes by noting that the push for big data has been a huge step forward, but that in the new data economy, organizations must “once again fundamentally rethink how the analysis of data can create value for themselves and their customers.”

John M. Kamensky is a Senior Research Fellow at the IBM Center for the Business of Government. He previously served as deputy director of Vice President Gore's National Partnership for Reinventing Government, a special assistant at the Office of Management and Budget, and an assistant director at the Government Accountability Office. He is a fellow of the National Academy of Public Administration and received a Master of Public Affairs from the Lyndon B. Johnson School of Public Affairs at the University of Texas at Austin.
