Edge data centers are among the hottest topics in data center management today. But what exactly is an edge data center? How can you tell if a facility is truly an edge data center, or if it’s simply a data center in an underserved market?

Unsurprisingly, it’s difficult to nail down a standard definition. That’s due in part to the relative infancy of edge data centers as an established trend. Another reason may be that data center professionals tend to define edge data centers from the perspectives of their own roles, industries, or priorities. While some may say that an edge data center is one that serves up large amounts of online content to the majority of the user population, others focus on resiliency and connectivity. Finally, defining edge computing, edge computing equipment, and even the edge itself can be a difficult endeavor, with the end results varying based on your use cases.

Edge data centers may be defined differently by different people, but they do tend to have certain aspects in common. Each characteristic can have a significant impact on how edge data centers are managed. In general, edge data centers are:

1. Local

Edge data centers are placed close to the local areas they serve, often in "Tier-2" locations. The idea is to reduce latency, network traffic, and costs by bringing data closer to where it needs to be accessed, while improving uptime and availability.

Their placement in remote locations and smaller metro areas means that most edge data centers are managed remotely, with little if any full-time data center staff. As with any remotely managed deployment, troubleshooting, maintenance, and visibility into the data center are key concerns.

2. Small

Edge data centers are small-to-mid-sized versions of their enterprise cousins. They have all the same components as a traditional data center but are packed into a much smaller footprint. As a result, many edge data center managers need to be cognizant of how they utilize space. For example, they may need to be creative with their environmental cooling systems: if they don't have the floor space for a traditional CRAC unit, they may need to use wall- or ceiling-mounted cooling solutions that take advantage of vertical space instead.

To reduce spending, energy consumption, and required space, some organizations are turning to micro data centers—mobile "data centers in a box" that contain power, cooling, connectivity, and other required components.

3. Part of a Larger Deployment

An edge data center may be remote, but it rarely stands alone. Most are part of a complex deployment involving a central, enterprise data center and multiple edge data centers. However, instead of connecting users back to the central data center, an edge data center serves up content on its own. Keep in mind that edge data centers aren't used only for processing data; they also store or cache data to serve that content up faster.

As part of a larger deployment, an edge data center can be difficult to manage effectively, especially if you’re struggling with scalability. Additionally, when each data center is managed locally instead of through a centralized system, you run the risk of ad-hoc changes to equipment and processes, lack of transparency into what’s going on in each data center, security threats, and non-compliance with industry regulations and company standards.

4. Mission Critical

Despite being smaller, local, and part of a larger deployment, edge data centers are just as mission critical as their traditional counterparts. Availability and uptime at the edge are becoming increasingly important as more mission-critical capabilities—think everything from online streaming services to self-driving cars—reach not only Tier-2 markets but also remote locations.

Having your edge data center go down could result in unhappy customers and significant losses for your organization. Thus, maintaining uptime and availability by keeping an eye on your power, cooling, and network capacities is paramount when managing an edge data center.
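To make "keeping an eye on your power, cooling, and network capacities" concrete, here is a minimal sketch in Python of the kind of threshold check a remote monitoring or DCIM tool performs behind the scenes. The metric names, utilization values, and thresholds are hypothetical, chosen only for illustration—they are not part of any particular product's API.

```python
# Illustrative sketch only: a simple threshold check over capacity
# metrics for a remotely monitored edge site. All names and numbers
# below are hypothetical examples.

def capacity_alerts(metrics, thresholds, default_limit=0.8):
    """Return the metrics that meet or exceed their alert thresholds.

    metrics:    dict of metric name -> current utilization (0.0-1.0)
    thresholds: dict of metric name -> alert threshold (0.0-1.0)
    Metrics without an explicit threshold use default_limit (80%).
    """
    return {
        name: value
        for name, value in metrics.items()
        if value >= thresholds.get(name, default_limit)
    }

# Example: one edge site's current utilization vs. its alert limits
site = {"power": 0.91, "cooling": 0.62, "network": 0.84}
limits = {"power": 0.85, "cooling": 0.90, "network": 0.80}

print(capacity_alerts(site, limits))  # power and network exceed limits
```

In practice a DCIM platform aggregates checks like this across every edge site and surfaces the results in a central dashboard, so no one has to poll each remote facility by hand.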

While the definitions may be hazy, the advantages of edge data centers are not. Reduced latency, efficient capacity utilization, and support for mission-critical applications mean edge data centers are here to stay.

At Sunbird, these four characteristics form the basis for how we describe edge data centers: as data centers that house mission-critical data, applications, and services for edge-based processing and storage. Edge data center managers looking to improve uptime and drive efficiency should consider investing in Data Center Infrastructure Management (DCIM) software, a critical enabler of success at the edge.

Want to learn more about data centers at the edge and how to manage them more effectively with DCIM software? Check out our President Herman Chan’s recent article in Data Center Frontier: 5 Edge Data Center Management Challenges.

About Data Center Infrastructure Management (DCIM) Software

Today's data centers have grown in size, density, and complexity. Managers need an easy-to-use data center management system that improves efficiency through optimization and extends the useful life of existing physical infrastructure while ensuring uptime. DCIM software, a new category of tools with integrated processes, combines asset tracking with the coordination and validation of managing space, power, cooling, and data center cabling.