The brave new world of big data & Hadoop

Big data is opening up new avenues of understanding and activity for companies and the people they serve. At the highest level, this movement provides a significant competitive advantage for prescient organizations willing to invest in developing sound and actionable big data strategies.

Big data is evolving into a key utility, a resource that can be leveraged both for profit and for risk mitigation. On the profit side of the ledger, businesses can compile, analyze, manage, and leverage huge amounts of data that enable them to understand and serve the needs of their clients in new ways.

On the risk mitigation side, big data can be deployed to show patterns and irregularities that reveal — and even prevent — fraud and other security threats.

The stakes are high. Organizations that fail to develop effective big data strategies risk being overrun by competitors who beat them to the punch, and risk leaving themselves and their customers exposed to increasingly sophisticated cyber-attacks.

The Hadoop Phenomenon: Open source isn’t free

The big data movement is being driven by Hadoop, an open-source platform created by Doug Cutting and Mike Cafarella and developed extensively at Yahoo, and named after the stuffed elephant belonging to Cutting's son. A host of prominent organizations, led by household names such as Walmart, Amazon, and Facebook, along with a growing number of financial institutions, are deploying Hadoop for a wide range of applications.

While Hadoop is an open-source platform, developing and deploying it is far from free: its sheer complexity makes it necessary to build and maintain a significant infrastructure. The platform's widespread and growing resonance in the marketplace rests on the genuine advantages it provides: the market's highest performance and scalability at much-reduced cost compared with other approaches. Once that infrastructure is built, it enables the extraction of far more data and information than can be obtained from Oracle infrastructure or other competing solutions.

How to Hadoop: Maximizing the value of big data

There are two primary ways of engaging with Hadoop.

The first, building the capability internally, seems to hold out the promise of flexibility and control for organizations that employ it. While this has sometimes worked for large companies, a variety of studies indicate that even among Fortune 500 companies, fewer than 20 percent of those that began Hadoop development succeeded in deploying a solution.

The second approach entails working with a big-data, Hadoop-focused third party to develop a bespoke solution. In addition to eliminating the requirement of enormous equipment and human capital investment, this approach also enables organizations, their executives, and IT staff to focus on their core value propositions rather than being forced to become Hadoop specialists.

In addition, a number of Hadoop specialists are quickly developing deep domain expertise in big data. For example, Zettaset recently partnered with Hyve Solutions and Fusion-io to create what the companies describe as the market's fastest Hadoop solution, 20 to 30 times faster than anything else currently available.

In our experience working with organizations across a wide spectrum of industries and sizes, we have found that the best way to develop and leverage Hadoop is to start with small but significant projects. Following this path, companies need not become Hadoop specialists to fully leverage the platform, nor boil the ocean by hiring armies of programmers or professional-services firms to build a huge ecosystem with no endgame in sight.
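To make "small but significant" concrete: one common first project is a word-count or event-count job written for Hadoop Streaming, which lets plain scripts act as the mapper and reducer over lines of text. The sketch below is illustrative only; the function names and the local simulation are our own, not part of any Hadoop API, and a real deployment would submit the mapper and reducer to a cluster rather than run them in one process.

```python
# A toy word-count job in the Hadoop Streaming style: a mapper emits
# (key, value) pairs, and a reducer aggregates the values for each key.
# run_locally() simulates the map -> shuffle -> reduce pipeline on one
# machine so the logic can be tested before touching a cluster.

from collections import defaultdict


def map_words(line):
    """Mapper: emit (word, 1) for every word in an input line."""
    for word in line.strip().lower().split():
        yield word, 1


def reduce_counts(pairs):
    """Reducer: sum the counts for each word.

    On a real cluster, Hadoop's shuffle phase delivers the pairs grouped
    by key; here we simply accumulate them in a dictionary.
    """
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)


def run_locally(lines):
    """Simulate the full pipeline in-process (illustrative helper)."""
    mapped = [pair for line in lines for pair in map_words(line)]
    return reduce_counts(mapped)


if __name__ == "__main__":
    print(run_locally(["big data big plans", "big wins"]))
```

Because the mapper and reducer are ordinary functions over text, the same logic can later be pointed at terabytes of data on a cluster with essentially no rewrite, which is what makes this kind of starter project a low-risk on-ramp.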

With a relatively modest investment in hardware, companies can leverage a Hadoop solution across the organization. The smaller footprint maximizes server efficiency and enables small-to-medium-sized companies as well as large players to participate in the ongoing big data revolution.

Freed of the requirement to download, program, manage, secure, and write code in Hadoop, companies are able to use their resources to do what they do best: analyzing data, developing business, mitigating risk, and confronting external threats.

In a wide range of high-impact businesses (including financial institutions, social media, ecommerce, oil and gas and other utilities, and healthcare), big data represents a profoundly new way of doing business. Powered by Hadoop, big data has hit the big time, and its influence will only continue to grow.

Brian Christian is CEO of Zettaset, which delivers a fault-tolerant and highly available solution for big data aggregation. The Zettaset platform allows customers to aggregate huge amounts of data from disparate sources, perform complex analytics on the data in a parallel fashion, and visualize the results in an intuitive way while being able to scale to thousands of commodity servers.