The obvious one is the sheer load we are contemplating and whether the infrastructure to support it can be built at all. By some estimates, global data output is due to rise from about 4 zettabytes per year today to more than 44 zettabytes by 2020. That's not the total amount of data under management, mind you, but the amount the world will generate in a single year. Compare that to the average annual growth of the data center market, currently estimated at about 11 percent, and it is clear that trouble is brewing down the line.
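A quick back-of-the-envelope calculation makes the mismatch concrete. This is only a sketch: the six-year horizon is an assumption (the figures imply roughly 2014 to 2020), not something the estimates above spell out.

```python
# Rough sanity check on the gap described above: if annual data output
# grows from ~4 ZB to ~44 ZB over an assumed six-year span, what
# compound annual growth rate does that imply, and how does it compare
# with the ~11% annual growth of the data center market?

def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate needed to go from start to end."""
    return (end / start) ** (1 / years) - 1

data_growth = implied_cagr(4, 44, 6)   # roughly 0.49, i.e. ~49% per year
infra_growth = 0.11                    # estimated data center market growth

print(f"implied data-output CAGR: {data_growth:.1%}")
print(f"data center market growth: {infra_growth:.1%}")
```

Even allowing for generous error bars on the zettabyte estimates, data output compounding at roughly four times the rate of the infrastructure meant to hold it is the "trouble brewing" in question.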

Perhaps a better option would be to improve data management and storage tiering, which ultimately comes down to the question of what data is safe to delete, and when. Opinions vary on this subject: some, like attorney Ralph Losey, argue that the rise of data lakes and other advanced architectures supports the case for never deleting data at all, while others, like Martin Garland, president of Concept Searching, say that data does not retain its value forever. Indeed, much of the Big Data load is expected to be highly ephemeral, retaining its value for only a few seconds before the decision window closes. Either way, determining what is and is not valuable will require substantially better search and governance tools than exist today.

The real power of Big Data lies not in its size, however, but in its content. And the best way to leverage that content is to bring disparate data sets together to derive hidden insights. To do that, though, you need networking. While global bandwidth is likely to support the load going forward, there will also be pockets of scarcity that could affect performance in other regions, says communications analyst Gary Kim. The Middle East and Africa, for example, are slated to see the highest growth in data traffic, about 54 percent, so those regions will need a concomitant rise in Internet exchange points (IXPs), which at the moment are concentrated in North America, Europe and Asia. Worldwide network infrastructure will also need to be retooled for the emerging cloud-based, data center-to-data center traffic.

Despite these challenges, fears of the worldwide data ecosystem crashing down around our ears at some point are overblown. The fact is, Big Data will perform to the limits of its underlying infrastructure, and at some point the cost of those resources will start to reflect supply and demand. This will be a cold slap in the face for CEOs and business managers, as well as the increasingly data-dependent consumer market, when they discover that they can't have everything just because they want it – at least, not without paying a premium.

But since necessity is the mother of invention, a technological breakthrough may well be on the horizon that pushes our data-handling capabilities to the next level. We will still do our best to max it out as soon as possible, but at least it will provide a little breathing room.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.