Technology

Analytics
The word "analytics" has suddenly become popular and is on everyone's mind these days. For some, the difference is that "analytics" activities look forward and help inform predictive decisions, while "business intelligence" looks backward. For others, the terms are interchangeable. Some think about one or the other in terms of hardware and software, while others think first about processes and principles. Does this matter?

Definitions and implementations matter. Enterprises and organizations are complex systems, and the way we talk about systems matters. Complex systems require a deeper understanding of their landscape and how things "play" together (i.e., relationships and relevance). All systems have behaviors and rules. An analytics approach is to recognize which structures contain which latent behaviors, determine what conditions release those behaviors, and, where possible, rearrange the structures and conditions to reduce the probability of destructive behaviors and to promote the beneficial ones.

Analytics is not:

Analytics is not Business Intelligence.

Analytics is not Data Warehousing.

Analytics is not Reporting.

Analytics is not about OLTP, OLAP Cubes, or Databases.

Analytics strives to be:

Analytics is making sense of raw data, identifying patterns and recommending actions.

Analytics is as close as one can get to Management Consulting frameworks.

Analytics is about using all of the existing reports to tell a coherent story.

Analytics is about using every tool available to gain greater knowledge.

Analytics is about insights that "Change the Business" and rarely about "Running the Business".

Analytics is about enabling customer insights and data-driven decision making instead of purely intuition-based decision making.

Analytics is a tool that helps detect the expected and discover the unexpected. Analytics helps form higher-order classifications and patterns using goal-seeking and analytical methods. Extreme Analytics is evolving by adding the ability to digest extreme data in (near) real time for significant communications and strategy support.
Having Extreme Analytics grasp the digital universe is certainly a monumental task, but it is possible today.
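The idea of detecting the expected and discovering the unexpected can be illustrated with a minimal sketch (a hypothetical illustration, not part of any Scalable Analytics product): flag readings that deviate sharply from the norm of a data stream.

```python
import statistics

def find_anomalies(readings, threshold=2.0):
    """Flag readings that deviate from the mean by more than
    `threshold` standard deviations -- the 'unexpected'."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    return [x for x in readings if abs(x - mean) > threshold * stdev]

# Mostly 'expected' values around 50, with one outlier.
readings = [49, 51, 50, 48, 52, 50, 49, 51, 120, 50]
print(find_anomalies(readings))  # [120]
```

Real analytics pipelines use far richer methods, but the shape is the same: model the expected, then surface what falls outside it.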

Big Data
Big data is best defined as the point at which an entity has more data than its current technology can handle. For a small business, this may mean that a single workstation cannot handle the increased data in a timely manner; for a large power company, it could mean that a multimillion-dollar technology investment is insufficient to handle the increased collection of data and the requirement to transform data into knowledge.
Scalable Analytics research compared data penetration by sector, rating each sector high, medium, or low, and then ranked the sectors by analytic potential using five variables: overall ease of use, the talent available to use the data, IT intensity in the sector, whether the sector has a data-driven mindset, and the data available.
Manufacturing and utilities both ranked highest in Data Penetration by Sector and in Big Data Analytic Potential. This means that both sectors could potentially benefit from the Scalable Analytics technology.
Scalable Analytics research confirmed the need to focus on the utility sector, due to the greater ongoing investment and spending there compared with manufacturing.
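The rating-and-ranking approach described above can be sketched as follows. All scores here are hypothetical placeholders (high = 3, medium = 2, low = 1), not the actual research data, and are used only to show how five variable ratings might combine into a ranking.

```python
# Hypothetical sector ratings: high=3, medium=2, low=1 (illustrative only).
VARIABLES = ("ease_of_use", "talent", "it_intensity",
             "data_driven_mindset", "data_available")

sectors = {
    "manufacturing": {"ease_of_use": 3, "talent": 2, "it_intensity": 3,
                      "data_driven_mindset": 2, "data_available": 3},
    "utilities":     {"ease_of_use": 2, "talent": 2, "it_intensity": 3,
                      "data_driven_mindset": 3, "data_available": 3},
    "retail":        {"ease_of_use": 2, "talent": 1, "it_intensity": 2,
                      "data_driven_mindset": 2, "data_available": 2},
}

def analytic_potential(scores):
    """Combine the five variable ratings into a single score."""
    return sum(scores[v] for v in VARIABLES)

# Rank sectors from highest to lowest analytic potential.
ranked = sorted(sectors, key=lambda s: analytic_potential(sectors[s]),
                reverse=True)
print(ranked)
```

The actual research may have weighted the variables differently; a simple sum is the minimal assumption.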

A supercomputer is a computer at the frontline of current processing capacity, particularly speed of calculation and rate of data throughput.
Supercomputing (often referred to as high performance computing or HPC) brings together computers, software and expertise to solve problems that are too difficult to solve effectively by any other means. There are many ways to apply extra resources in order to get workloads and workflows done. Supercomputing does this using many computers, sometimes tens of thousands of them. Supercomputing usually works by breaking problems (i.e., Extreme Analytics in our case) into pieces and working on those pieces at the same time (i.e., in parallel).
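The break-into-pieces-and-work-in-parallel pattern can be sketched in miniature on a single machine (a hypothetical illustration, not any Scalable Analytics code): split a dataset into chunks, process each chunk on its own worker process, and combine the partial results.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    """Work on one piece of the problem independently."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(100_000))
    # Break the problem into pieces...
    chunks = [data[i::4] for i in range(4)]
    # ...and work on the pieces at the same time (in parallel).
    with Pool(processes=4) as pool:
        partials = pool.map(partial_sum, chunks)
    # Combine the partial results into the final answer.
    total = sum(partials)
    print(total == sum(x * x for x in data))  # True
```

A supercomputer applies the same idea across thousands of nodes, with the hard work lying in how the problem is decomposed and how the pieces communicate.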

Supercomputing
Supercomputing was once the province of big science and government laboratories. The barriers that once prevented an enterprise or organization from using supercomputing no longer exist.
Today, supercomputing directly impacts our lives in ways we seldom recognize: product packaging (e.g., water bottle design for strength and transportability), materials design (e.g., golf ball performance) and even children's products (e.g., better diapers). Many things are better today because of supercomputing.

Throwing lots of extreme data at lots of teraflops on a supercomputer will give you results, but it's like cracking a cryptographic message: you could find a decryption that produces "Shakespeare" from any message, but it's probably not the right answer. The key is to deploy and use an integrated, scalable analytics solution for extreme data, with best-in-class algorithms, world-class performance, and ease of use.

Use Cases
Smart Grid data is outpacing our ability to analyze it. Scalable Analytics was built to scale to accommodate real-time grid analytics on extreme data.