Optimizing the WAN for a Better Cloud

Enterprises will soon be moving large amounts of data to and from the cloud -- or so says the conventional wisdom -- realizing unprecedented flexibility through global load balancing and resource scalability.

And it hasn't gone unnoticed that this will have a tremendous impact on enterprises' wide area infrastructures, meaning WAN optimization and application acceleration technologies are likely to see a big boost in demand.

Many traditional appliance-based optimization vendors are starting to focus more closely on cloud applications. A company called NetEx has tailored its HyperIP solution as a disaster recovery tool optimized for pulling data from remote cloud storage resources. The company has zeroed in on the packet-loss and latency issues surrounding long-distance data transfer, which can overwhelm the standard caching and protocol optimization techniques of most acceleration systems. The system uses a mix of local TCP acknowledgement, packet aggregation and TCP window adjustment to keep the data moving.
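To illustrate why TCP window adjustment matters on long-distance links, here is a minimal Python sketch -- not NetEx's implementation, just an illustration of the general technique. It computes the bandwidth-delay product (the bytes that must be in flight to keep a high-latency pipe full) and requests socket buffers sized to match:

```python
import socket

def bandwidth_delay_product(bandwidth_bps, rtt_seconds):
    """Bytes that must be in flight to keep a long-haul link full."""
    return int(bandwidth_bps / 8 * rtt_seconds)

# Example: a 100 Mbps link with an 80 ms round-trip time needs roughly
# a 1 MB window -- far larger than the classic 64 KB TCP default.
bdp = bandwidth_delay_product(100_000_000, 0.080)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Request send/receive buffers sized to the BDP; the OS may clamp
# these to its configured maximums.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp)
```

If the window stays at the default while latency grows, throughput collapses regardless of link capacity, which is exactly the gap these acceleration appliances are built to close.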

Improved WAN performance is one of the key factors that will overcome skepticism of the cloud among IT executives, according to TechRepublic's Jason Hiner. His take is that WAN speeds approaching 100 Mbps in the next few years will allow enterprises to separate front-end applications from back-end ones -- running the front end in a Web browser while scalable hosted databases handle the back end -- while adding an extra degree of security by requiring data to travel through well-defined pipelines.

Improvements in WAN performance could also be a big win for VARs as cloud computing gains acceptance. As Zeus Technology's Michael Stewart tells Business Solutions, LAN and WAN acceleration are big plusses for both service providers and enterprises looking to extend virtual networks outside their brick-and-mortar facilities. And it can be a rather tricky job to integrate an acceleration system into a virtual or cloud environment that is itself resting on a legacy network infrastructure.

But while much of the attention has focused on the ways WAN optimization can help the cloud, more recent developments are starting to suggest it could be a two-way relationship -- that current optimization technologies could see even greater gains by adopting cloud-based techniques.

Alcatel-Lucent, for example, has launched a new program that it says offers wide area optimization for cloud applications without having to install expensive technologies throughout the enterprise. The Application-Assured VPN system essentially does for acceleration what the cloud has done for storage, placing the technology, and the responsibility for maintaining it, on the service provider's end. The service is enabled through a new card for A-L's 7450 switch and 7750 router, capable of providing acceleration at throughput ranging from 20 Gbps to 1 Tbps, with per-application deep-packet inspection and full Web-based reporting and analysis thrown in.

Since cloud technology by its nature takes the focus off of internal resources like servers and storage and places it on the network to enable higher productivity, it's only natural that improvements in network performance would start to gain in importance. And if the cloud does gain in popularity over the next few years as many experts predict, that higher traffic will make it more crucial for networks to operate at their best.


Interesting article. We're struggling to capture a vendor who is focusing on this area specifically. Our VPN platform from NeoAccel talks about optimization of TCP traffic and end-to-end security, and the fact that as everyone accesses the cloud from behind firewalls in remote locations, the firewall itself is no longer the right tool for security and authentication (in the cloud). They maintain it's the role of the VPN to ensure the endpoint is clean, the user is who they say they are, the user can only access what he or she is supposed to access, and finally to stop data leakage once data is downstream. I'm not sure if anyone else is talking about this kind of paradigm shift away from the traditional gatekeeper, i.e. the firewall, but it sure seems disruptive if nothing else.