Free Your Data, Free Your Mind


There is a well-known cliché in project management: “Fast, good, or cheap — pick two.” As a systems integration consultant early in my career, I found this saying held true for enterprise IT projects. In practice, managers faced painful tradeoffs: Cheap and good? Project delays are imminent. Cheap and fast? You are getting a buggy prototype, not a real system. Good and fast? Take your budget and double it. Then do it again.

The reality is actually worse. Many enterprise development teams fail to achieve even two of the three — their results aren’t fast, aren’t good, and aren’t cheap. Managers make significant compromises that ultimately lead them into traps they cannot escape.

Because projects routinely run over budget and behind schedule, teams have no slack left to adjust when delays occur. Projects simply steam along, amassing cost and time overruns daily, until managers are forced into compromises that sacrifice quality, cost savings, and timely delivery all at once.

Over the past decade, companies have tried a variety of fixes. If data management becomes too expensive, move it to the cloud. If the approval process takes too long, turn to shadow IT. If applications move too slowly, adopt SaaS. The problem with this approach, however, is that it never reaches the core of the problem — the data.

There is a damaging misconception in IT that equates data cost with storage cost. Since storage is getting cheaper, the cost of data must also be dropping. However, there are large hidden costs of data, and these drag down every important IT project. CIOs must take into account all factors associated with data management in order to break free from this logjam.

First and foremost, data is heavy — it is difficult to copy and to move to the cloud. Those hidden costs surface as project delays: not having the right data set at the right time, poor data undermining quality testing, and “flying blind” on stale data. To break free of the misconception, CIOs must start viewing data as the key to fast, cheap, and quality applications.

Fortunately, the timing of this shift in perception has been paired with emerging technologies that address the ways in which companies currently manage data. The advent of virtualization transformed the data center by turning heavy, expensive servers into software. Innovative virtualization techniques have taken this process a step further by setting this heavy data free.

One of the largest online ticket sellers in the U.S. added three weeks to every project, simply to prepare its data. Each project also included an average 20 percent schedule buffer to offset delays in updating test data. The company planned to embrace quality at the expense of speed. However, as data volumes grew, the company was forced to test with samples instead of the actual data, resulting in more bugs leaking into production. In the process of trading speed for quality, the project also lost quality. And of course, delays and outages meant sacrificing cheap as well.

However, by applying new virtualization techniques to its databases, that same online ticket seller was able to eliminate the 20 percent buffer, the three-week lead time, and almost 20 percent of production bugs — all at a lower cost.

Another firm, a Fortune 500 manufacturer, optimized for lower project costs by outsourcing the operations of its SAP systems to a large service provider. The contract limited what the outsourcer could do, so the firm was actually trading off fast in order to receive cheap and, hopefully, some good. In practice, the contractual delays in providing data to the firm’s application teams and business managers meant that the firm missed opportunities. It lost good and, as a result of regularly paying for special exceptions to the outsourcer, it also gave up cheap.

After its systems were modified using emerging virtualization techniques with its key databases, this same firm found that it was able to cut over $10 million from its outsourcing costs while cutting out 65 percent of the time required to refresh its reporting and operations data.

These examples will sound familiar to any IT executive at a large enterprise. CIOs who accept forced compromises in hopes of optimizing operations often end up losing on all fronts. But there is no reason important projects should fail more often than they succeed. CIOs must readjust their mindset and treat data as the pathway to meeting all three objectives.