HuffPost Pollster API Enables Public Access to Polling Data

The Huffington Post has released the HuffPost Pollster API, which opens up access to all of the polling data on the site. Software developers can use the API to access information about the public polls published on the Huffington Post. This will be of particular interest as the presidential race heats up and people look to political polls to understand the current opinions of the electorate. The initial release of the Pollster API includes data from more than 215,000 responses to questions on a broad range of subjects, drawn from some 13,000 polls that HuffPost has organized by subject and geography into more than 200 charts. Here’s what HuffPost has to say about the poll API mechanism:

“Since being able to understand the methodology behind opinion surveys is an important step toward increasing transparency in the opinion polling industry, we’re including information about the methodology for each poll. And to make the data independently verifiable, we’ve included a link to the original source that conducted or reported the poll along with each entry. We currently calculate these [chart trend estimates] by running a locally weighted polynomial regression on every poll for a specific category of question. But we’re continually improving our methodology for combining the information from different opinion polls into a single estimate.”
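For developers who want to experiment, the data is a straightforward HTTP call away. The sketch below is a minimal, hypothetical example in Python: the `/pollster/api/charts.json` endpoint and the `topic` and `state` parameters reflect the API as documented at launch, but the URL, parameters, and response fields should all be verified against the current documentation before relying on them.

```python
import requests  # third-party HTTP library (pip install requests)

# Hypothetical base URL matching the Pollster API's launch documentation;
# verify against the current docs before depending on it.
BASE = "http://elections.huffingtonpost.com/pollster/api"

def fetch_charts(topic="2012-president", state=None):
    """Fetch chart summaries (poll aggregates) for a topic, optionally by state."""
    params = {"topic": topic}
    if state:
        params["state"] = state
    resp = requests.get(BASE + "/charts.json", params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()  # assumed: a list of chart dicts

if __name__ == "__main__":
    for chart in fetch_charts():
        # Field names ("title", "poll_count") are assumed from the launch docs.
        print(chart.get("title"), chart.get("poll_count"), "polls")
```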

The release of the Pollster API is all about data transparency: it gives the public a better idea of what people think (by asking them), and it means that other writers, researchers, and data-visualization producers can generate graphs and charts from Huffington Post data.
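To make the locally weighted regression HuffPost describes above concrete, here is a small sketch using the `lowess` smoother from statsmodels. The poll numbers are invented purely for illustration; real inputs would come from the Pollster API.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical data: days since the campaign began and a candidate's
# support (%) from individual polls; real values would come from the API.
days = np.array([1, 8, 15, 22, 29, 36, 43, 50], dtype=float)
support = np.array([44.0, 46.5, 43.8, 47.2, 48.0, 46.1, 49.3, 48.7])

# Locally weighted regression, the same family of technique HuffPost
# describes for combining individual polls into a single trend estimate.
trend = lowess(support, days, frac=0.6)  # frac controls the smoothing window

for day, estimate in trend:
    print(f"day {day:4.0f}: estimated support {estimate:4.1f}%")
```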

This ties in nicely with Big Data visualization, since the Pollster API mechanism puts Big Data to work visualizing and displaying information for public use. Web data visualization is one of the significant benefits of Big Data, powering infographics, graphs, charts, and other data visuals. Since public institutions generate a great deal of data, and generally run on statistics, it is genuinely useful for them to release that data not just to the media but to the public to digest directly. One of the best recent examples is Google’s cloud-based web data visualization interface for the everyman, the Google Public Data Explorer. It currently supports 27 data sets and more than 300 metrics, ranging from labor productivity and Internet speed to gender balance in parliaments, government debt levels, and population density by municipality. Google opened the Public Data Explorer to outside datasets back in February, with the following statement:

“Today, we’re opening the Public Data Explorer to your data. We’re making a new data format, the Dataset Publishing Language (DSPL), openly available, and providing an interface for anyone to upload their datasets. DSPL is an XML-based format designed from the ground up to support rich, interactive visualizations like those in the Public Data Explorer.”

Not every government data effort is on such solid footing, however. As one open-government campaign warned:

“Some of the most important technology programs that keep Washington accountable are in danger of being eliminated. Data.gov, USASpending.gov, the IT Dashboard and other federal data transparency and government accountability programs are facing a massive budget cut, despite only being a tiny fraction of the national budget. Help save the data and make sure that Congress doesn’t leave the American people in the dark.”

This shift toward data transparency from the government will definitely alter the way the public understands the inner workings of the State. It fuels the engine of discovery for citizen journalism and public self-education in government spending and statistics, and it gives ordinary citizens, thinkers, and bloggers access to the kind of information traditional newspapers would spend thousands of dollars to obtain. Without this information, services such as Google’s Public Data Explorer visualization tool and IBM’s City Forward tool would be much less useful.

Premium Research

Wikibon argues strongly against a revolutionary conversion to a 3rd platform. The conclusion from this analysis is that applications will evolve; wholesale conversion should be avoided like the plague. The greatest opportunity is to continuously adapt today's operational applications by adding real-time or near real-time analytics directly to current organizational processes and the applications that support them. This is likely to deliver the greatest value to most organizations while avoiding, where possible, the risks of converting systems. Studies of organizations that have applied real-time analytics to their current operational systems show significant improvements in both cost and adaptability. Business and IT executives should understand the enormous potential of adding decision automation through real-time analytics to the operational applications already in their organizations. New technologies should be judged by their ability to support real-time analytics applied to operational systems and to sustain incremental improvement over time.

In a recent web-based survey conducted by Wikibon, 300 North American enterprises that had either adopted or were considering adopting the public cloud answered questions about IaaS (Infrastructure as a Service) perceptions and usage. The questions varied in topic but centered on which workloads are best suited to the public cloud. This research examines a few additional key insights that shed light on the growing IaaS world.

Today's technology infrastructure management is largely non-differentiated and wasteful. Technology executives must rethink the strategic role of human capital and begin to implement new ways to consume IT as a service. This post draws on the learnings of Alan Nance, a senior executive at Royal Philips, which is dogmatic in its approach to transforming its infrastructure to a service model.

There have only been two successful volume memory introductions in the marketplace in the last 50 years: DRAM and NAND flash. There has to be a clear volume case with good economics for 3D XP to gain a foothold in consumer products. Without volume in the consumer space, there is unlikely to be much volume traction in the enterprise space. CIOs, CTOs, and enterprise professionals should take a wait-and-see stance and monitor the adoption of 3D XP in the consumer and military spaces. If and when 3D XP reaches volume production, enterprise adoption should follow about two years later.

The use of open source software continues to accelerate and expand in the marketplace, especially in areas where technology is significantly disrupting established business models. IT organizations should be actively seeking to understand how open communities operate, how different licensing models work, and how they can be more actively engaged with both the vendors and communities that are shaping open source software.

CIOs understand that a clear cloud strategy is critical for IT today. Wikibon believes the biggest mistake organizations can make is converting major applications to the public cloud (including SaaS) without thinking through the implications for their existing business process workflows. Wikibon recommends that IT develop and implement a hybrid cloud strategy that applies the existing management workflows and compliance processes to both the public and private cloud components of the hybrid cloud.

In 2014, Wikibon defined a new category, "Server SAN," that sits at the intersection of software-defined storage, hyperscale methodologies, and converged infrastructure. This article is the executive summary of primary research that gives the status of the market, examines the vendor ecosystem, lays out the revenue figures and a 10-year forecast, and gives direction for expansion beyond simple "hyperconverged infrastructure." The summary is available for public consumption; the full research is available to Wikibon clients.

In this research paper, Wikibon looks back at its introductory Server SAN research, adjusts the Server SAN definition to include System Drag, and raises its projected speed of Server SAN adoption based on very fast adoption from 2012 to 2014. The overall growth of Server SAN is projected at about a 23% CAGR from 2014 to 2026, with faster growth of 38% from 2014 to 2020. The total Server SAN market is projected to grow to over $48 billion by 2026. The traditional enterprise storage market is projected to decline at a -16% CAGR, leading to overall growth in storage spend of 3% CAGR through 2026. Traditional enterprise storage is being squeezed in a vise between a superior, lower-cost, and more flexible storage model in Enterprise Server SAN and the migration of IT toward cloud computing and Hyperscale Server SAN deployments. Wikibon strongly recommends that CTOs and CIOs initiate Server SAN pilot projects in 2015, particularly for applications where either low cost or high performance is required.
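As a back-of-the-envelope check on those compound growth figures, the arithmetic works out roughly as follows. The 2014 base value is implied rather than stated in the summary, so it is back-solved here from the 2026 projection.

```python
def cagr_value(base, rate, years):
    """Project a value forward at a constant compound annual growth rate."""
    return base * (1 + rate) ** years

# Back-solve the implied 2014 Server SAN market from the 2026 projection
# of over $48B at a 23% CAGR over 12 years (figures cited above).
market_2026 = 48.0  # $ billions
implied_2014 = market_2026 / (1.23 ** 12)
print(f"Implied 2014 market: ${implied_2014:.1f}B")  # roughly $4B

# Grow that base at the faster 38% CAGR through 2020, then confirm the
# 2026 figure at the overall 23% CAGR.
print(f"2020 at 38% CAGR: ${cagr_value(implied_2014, 0.38, 6):.1f}B")
print(f"2026 at 23% CAGR: ${cagr_value(implied_2014, 0.23, 12):.1f}B")
```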

If containers are at the center of a shift in how applications are developed and delivered, and their pace of growth and change is unprecedented in IT history, this could have a massive ripple effect on both suppliers and consumers across the ecosystem of IT technologies.