The combination of commodity storage and compute hardware, coupled with open source software and agile networks interconnecting data centres, makes the promise of big data analytics a reality rather than just an overused marketing term.

Asian enterprises need to work with their network and data centre operators to virtualise the networks that interconnect them, creating a "data center without walls" that pools together physically distinct and geographically separated data centres into one powerful big data analytics compute and storage platform that is essentially limitless.

The latest advances in compute and storage technologies, when combined with improved intelligence built into the networks that interconnect them, allow data centres split across different geographies to work together seamlessly. A unified big data-centric network architecture must incorporate network programmability, scalable bandwidth, low latency, and simplicity, as well as complementary network functions such as compression, encryption, optimisation, and real-time monitoring.
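One of the complementary network functions named above, compression, can be sketched in a few lines. This is an illustrative example only: real deployments perform WAN optimisation in network appliances or the transport layer, and the function names here are hypothetical.

```python
import zlib

def compress_for_transfer(payload: bytes, level: int = 6) -> bytes:
    """Compress a payload before it crosses the inter-data-centre link."""
    return zlib.compress(payload, level)

def decompress_on_arrival(blob: bytes) -> bytes:
    """Restore the payload at the receiving data centre."""
    return zlib.decompress(blob)

# Repetitive telemetry-style data compresses well, cutting wire traffic.
payload = b"sensor reading 42; " * 1000
blob = compress_for_transfer(payload)
assert decompress_on_arrival(blob) == payload
print(f"{len(payload)} bytes -> {len(blob)} bytes on the wire")
```

The point of the sketch is simply that reducing bytes on the wire is cheap at the endpoints and directly eases the bandwidth and latency demands placed on the interconnecting network.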

Much of the big data available today is unstructured and thus fits poorly, if at all, into conventional database management systems, making its analysis impossible in many cases. Open source software such as Hadoop allows the data to be split into chunks across multiple data centres and analysed in parallel, after which the partial results are recombined to extract a unified set of analysis results. Splitting up big data and sending it to different remote data centres is only possible through agile networks that interconnect them.
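The split/analyse/recombine pattern described above can be sketched in miniature. This is a toy illustration of the idea, not Hadoop's API: the function names are hypothetical, and a thread pool stands in for the geographically separated data centres that would each process their own chunk.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def analyse_chunk(chunk: str) -> Counter:
    """Map step: run the analysis locally on one chunk of the data."""
    return Counter(chunk.split())

def recombine(partials) -> Counter:
    """Reduce step: merge the per-chunk results into one unified result."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

# Split the unstructured data into chunks, one per "data centre".
chunks = ["big data needs", "big networks and", "big compute"]

# Analyse the chunks in parallel, then recombine the partial results.
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(analyse_chunk, chunks))

print(recombine(partials)["big"])  # -> 3
```

In a real deployment, the expensive part is not the merge but moving the chunks and partial results between sites, which is exactly where the agile interconnecting networks come in.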