Big Data, Fast Infrastructure

No doubt your enterprise is amassing loads of data for fact-based decision-making. Hand in hand with all that data come big computational requirements. Can traditional IT infrastructure handle the increasing number and complexity of your analytical workloads? Probably not, which is why you need a backend rethink. Big data calls for a high-performance analytics infrastructure, as Fern Halper, a partner at the IT consulting and research firm Hurwitz & Associates, discusses here.

Quite an exciting and informative video, Fern. Pervasive instrumentation and new requirements create vast amounts of data and drive new types of applications, which in turn need this kind of infrastructure in place to support data analytics.

As in prior discussions, this video blog helps explain how HPC can help managers quickly analyze and categorize the data they need. Once again, HPC looks like it may be one of the big data solutions of the future.

Interesting to see HPC explained again as a solution for coping with big data, and appropriately so, given the complexity of the information being studied. Great video that looks at the hows and whys of the issue and helps us visualize how all of this comes together within the enterprise.

@Fern, great video. Thanks for giving us the lowdown on HPC and HPA. One question for you -- we hear about in-memory analytics from time to time. Where does that fit into the big data analytics picture?

While 97% of insurers say that insurance fraud has increased or remained the same over the past two years, most of those companies report benefits from anti-fraud technology in limiting the impact of fraud, including higher-quality referrals, the ability to uncover organized fraud, and improved efficiency for investigators.