What is an Edge Data Center?
The recent explosion of the Internet of Things (IoT) and 5G networks has enabled the creation of new cloud-based applications across a wide range of industries. However, many of these applications – think driverless cars or wearable medical devices – require high-speed processing, low latency, and high bandwidth to operate correctly for end users. Edge data centers provide a way for content providers to move much of the processing to the edge of the network – or, in other words, closer to the user.
What is an Edge Data Center?
Edge data centers are decentralized facilities, equipped with power and cooling infrastructure, that provide computing and storage closer to where data is generated and/or used. As such, they store, process, and analyze data near the end user's location instead of routing traffic to the nearest major market for processing at a regional or cloud data center.
In terms of location, edge data centers are deployed either as standalone facilities or within other environments, such as telecommunications central offices, cable headends (i.e., local distribution points), the base of cell towers, or on-premises enterprise sites.
They are therefore smaller facilities, located closer to end users than regional data centers – large facilities in close proximity to urban population centers – and cloud data centers – massive, centralized facilities in remote areas where land and power are relatively inexpensive. Edge data centers are typically connected via fiber to these larger regional and cloud data centers.
Why do we need Edge Data Centers?
Latency has always been a problem for data center managers, but in recent years it has become a critical concern due to big data, the Internet of Things, cloud and streaming services, and other technology trends. End users and devices demand anywhere, anytime access to the applications, services, and data housed in today's data centers, and they no longer tolerate delays. As a result, organizations across many industries are establishing edge data centers as a high-performance and cost-effective way to provide customers with content and functionality.
How does it work?
An edge data center functions as a connection point between multiple networks, acting as an internet exchange point for requesting devices (e.g., mobile phones, laptops). In essence, it becomes a conduit through which multiple network and service providers can reach localized compute resources, especially for cloud-driven functions like edge computing and machine learning (ML).
Edge data centers are located close to the users and devices that generate, collect, and transmit data. Typically, they rely on edge caching – hardware- or software-based components that temporarily store data to reduce response time. Often, these components take the form of micro data centers (MDCs): modular systems designed for workloads that can run outside the centralized data center and can be scaled to specific needs. MDC deployments can include fog computing, which uses cloud and data storage infrastructure to move data to preferred locations, or mobile edge computing (i.e., cloudlets), small cloud-style data centers designed for mobile applications and devices.
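As a rough illustration of edge caching, the sketch below keeps recently requested content in local memory and only falls back to a distant origin on a miss; the `fetch_from_origin` helper, the 80 ms origin delay, and the TTL are illustrative assumptions rather than any particular product's behavior.

```python
import time

# Hypothetical origin fetch: stands in for a request to a distant regional
# or cloud data center. The 80 ms delay is an assumed figure used only to
# make the cache's benefit visible.
def fetch_from_origin(key: str) -> str:
    time.sleep(0.080)
    return f"content-for-{key}"

class EdgeCache:
    """Minimal in-memory edge cache with per-entry expiry (TTL)."""

    def __init__(self, ttl_seconds: float = 60.0):
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, str]] = {}

    def get(self, key: str) -> str:
        entry = self._store.get(key)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]                      # cache hit: served locally
        value = fetch_from_origin(key)           # cache miss: go to origin
        self._store[key] = (time.monotonic(), value)
        return value

cache = EdgeCache(ttl_seconds=30)
cache.get("video-segment-42")   # first request pays the origin round trip
cache.get("video-segment-42")   # repeat request is served from the edge
```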
Ultimately, edge data centers are designed to turn collected data into usable insights, whether that means enabling automated processes such as cybersecurity threat analysis or providing insight into device and infrastructure performance.
Some benefits
Provides faster service and higher bandwidth than traditional data centers
Because the data center sits closer to users, requests can be served faster than they would be by a larger, remote data center: the data travels a shorter distance, likely through fewer network devices such as switches and routers. In addition, edge data centers typically handle a smaller portion of a network's overall data, so computation is usually faster as well.
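To put rough numbers on the distance argument, the back-of-the-envelope sketch below estimates propagation delay alone for an assumed 20 km hop to an edge site versus an assumed 1,000 km trip to a regional facility; the distances and the roughly two-thirds-of-light-speed figure for fiber are illustrative, and real latency also includes switching, queuing, and processing time.

```python
# Light travels through optical fiber at roughly 2/3 of c, about 200,000 km/s,
# i.e. ~200 km per millisecond.
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

print(round_trip_ms(20))     # ~0.2 ms to an edge site 20 km away
print(round_trip_ms(1000))   # ~10 ms to a regional data center 1,000 km away
```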
Enables resilient networks
Traditional data centers are typically either multi-tenant facilities, where many users or businesses house their servers in the same space to cut costs, or enterprise facilities that support just one organization. The drawback of these centralized nodes is that downtime or interrupted service often results in application outages that can cost millions. Edge data centers, on the other hand, create mesh-like coverage in which the load of one downed data center is picked up by other edge locations. Edge applications and end users are the beneficiaries of this improved resiliency.
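A minimal sketch of that failover idea, assuming each site exposes some kind of health probe, might look like the following; the site names and the `is_healthy` check are hypothetical placeholders.

```python
# Edge sites listed in order of preference (nearest first); the names are
# hypothetical placeholders.
EDGE_SITES = ["edge-downtown", "edge-suburb", "regional-dc"]

def is_healthy(site: str) -> bool:
    # Placeholder for a real health probe (e.g., an HTTP health endpoint).
    return site != "edge-downtown"   # pretend the nearest site is down

def pick_site() -> str:
    """Return the first healthy site, falling back through the mesh."""
    for site in EDGE_SITES:
        if is_healthy(site):
            return site
    raise RuntimeError("no healthy site available")

print(pick_site())   # -> "edge-suburb": traffic shifts without an outage
```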
Cost-Effective
Because edge systems are tailored to the exact workload they need to support, they don't necessarily require as much hardware or maintenance as a more traditional data center. This reduces unnecessary overhead and makes it much more cost-effective to expand the network incrementally to meet demand.
Customizable
The modular nature of edge data centers not only affords cost-effectiveness but also allows for fine-grained customization based on the expected application. For example, an IoT deployment that processes images may require more GPUs than CPUs, a configuration that is not standard in a traditional data center.
Scalable
Keeping edge data centers small and modular lets the network expand as needed based on demand. As long as the software is built on a distributed systems architecture – where components running on different networks or platforms can still communicate with each other – scaling simply requires adding edge data centers to the network; the rest of the system can effectively stay the same. Here, the most important aspect of deployment is the ability to manufacture and install additional data center units as demand grows.
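The sketch below illustrates the install-and-register idea: routing picks the nearest registered site, so adding capacity amounts to adding another registry entry while the rest of the logic stays the same. The site names and coordinates are invented for the example.

```python
import math

# Registry of edge sites with (latitude, longitude); values are invented.
SITES = {
    "edge-east": (40.7, -74.0),
    "edge-west": (34.1, -118.2),
}

def nearest_site(user_lat: float, user_lon: float) -> str:
    """Route a user to the closest registered edge site (crude planar distance)."""
    return min(SITES, key=lambda s: math.dist(SITES[s], (user_lat, user_lon)))

print(nearest_site(41.9, -87.6))          # routed to an existing site

# Scaling out: deploying a new edge data center is just another registry
# entry; the routing logic and the rest of the system stay the same.
SITES["edge-central"] = (41.8, -87.7)
print(nearest_site(41.9, -87.6))          # now served by the new, closer site
```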
Use cases for Edge Data Centers
Internet of Things (IoT) Applications
The value of the IoT lies in its ability to support real-time asset monitoring, intelligent automation, and operational intelligence. Edge data centers provide the resources needed to process and analyze IoT data near the source for faster decision-making.
Virtual Reality (VR)
Just about any physical process can be replicated in VR. For example, retailers can use smart mirrors to enable customers to “try on” clothes without a dressing room. Creating a realistic virtual environment that allows users to interact with it naturally requires substantial processing power and high-speed data transmission. Edge data centers can provide those resources.
Artificial Intelligence
AI models are often developed and trained in a centralized data center or the cloud using historical data, then pushed to the edge to run inference on current data. Inference requires less computing power than training but demands lower latency. Edge AI has applications in many industry sectors – in manufacturing, for example, AI can analyze data collected from sensors to improve quality, reduce waste, and drive down costs.
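As a simplified illustration of that train-centrally, infer-at-the-edge split, the sketch below assumes a model has already been trained elsewhere and exported as plain weights (here a tiny hand-written logistic model over two sensor readings); the weights, threshold, and sensor values are all made up for the example.

```python
import math

# Weights exported from a centrally trained model (values are illustrative).
WEIGHTS = [0.8, -0.5]
BIAS = 0.1

def predict_defect(vibration: float, temperature: float) -> bool:
    """Run lightweight inference at the edge on one sensor sample."""
    score = WEIGHTS[0] * vibration + WEIGHTS[1] * temperature + BIAS
    probability = 1 / (1 + math.exp(-score))   # logistic output
    return probability > 0.5

# New sensor readings are scored locally, with no round trip to the cloud.
print(predict_defect(vibration=1.4, temperature=0.9))
```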
Regulatory Compliance
Government and industry regulations are increasingly concerned with data sovereignty – data must be maintained in a geographic location near the user to meet privacy requirements. Storing and managing data in an edge data center can facilitate regulatory compliance while enabling more efficient delivery of services to users. For example, retailers can collect and analyze data from the Wi-Fi network and provide personalized content without violating privacy laws.
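A minimal sketch of keeping data in-region, assuming each user record carries a region tag and each region has its own edge storage endpoint (the regions and endpoints below are hypothetical):

```python
# Hypothetical mapping of user regions to in-region edge storage endpoints.
REGIONAL_STORES = {
    "eu": "https://edge-frankfurt.example.com/store",
    "us": "https://edge-chicago.example.com/store",
}

def storage_endpoint(user_region: str) -> str:
    """Return the edge endpoint that keeps the user's data inside its region."""
    try:
        return REGIONAL_STORES[user_region]
    except KeyError:
        raise ValueError(f"no in-region store configured for {user_region!r}")

print(storage_endpoint("eu"))   # EU data stays on an EU edge site
```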
Characteristics
Most edge data centers share similar key characteristics, including:
- Location. Edge data centers are generally placed near the end devices they serve.
- Size. They have a much smaller footprint but contain the same types of components as a traditional data center.
- Type of data. They typically house mission-critical data that requires low-latency access.
- Deployment. An edge data center may be one node in a network of edge data centers or be connected to a larger, central data center.
The Role of DCIM in an Edge Data Center
There are many challenges specific to edge data center management, such as directing technicians to complete changes properly, monitoring data center health across multiple locations, and managing all assets and their connections across the entire deployment. Managing edge data centers remotely usually involves a combination of remote management tools, analytics capabilities, and databases, which can easily lead to inaccurate data, incorrect work orders, and poor decision-making.
Data Center Infrastructure Management (DCIM) software provides data center managers with a central system where they can view the assets, power, connectivity, cooling, and physical security across multiple locations and accurately make changes to their data centers wherever they are located. The remote management and business intelligence capabilities of DCIM software help edge data center managers achieve their goals of reducing latency while maintaining availability and uptime.