Explore edge computing services in the cloud

Learn what you need to know about edge computing and how it compares to cloud computing. Also, discover related management and deployment services from top providers.

With edge computing, organizations can get the fastest performance possible while avoiding the steep data transfer costs and security risks that come with moving data to and from the cloud.

Amazon Web Services, Microsoft Azure and Google Cloud offer services to make it easy to deploy workloads at the edge, while still integrating them with the providers' respective cloud platforms. Other platforms, such as Red Hat OpenShift, may also be viable solutions to set up and manage edge architectures.

Let's explore how edge computing works, its relation to the cloud and what major vendors offer for edge computing deployment and management.

What is edge computing?

Edge computing is the deployment of storage and/or compute resources close to the data source or user. When organizations use edge computing, they deploy workloads in physical locations that are near the places where data is produced or consumed.

For example, a retailer could set up an edge computing infrastructure to host payment processing applications within physical stores. That way, the payment data that the applications need to collect and process would be available on the same local infrastructure where the applications are hosted. This reduces the time required to process the data since the data doesn't need to be transferred to a remote data center first.
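
To make that scenario concrete, here is a minimal sketch of what such an in-store service might look like: a small HTTP endpoint that records payments in a database on the store's own server, so transaction data never has to leave the local network. Everything in it -- the port, the schema, the trivial approval response -- is illustrative rather than a real payment implementation.

```python
# Minimal sketch of an in-store payment service that keeps data on local
# infrastructure. Hostnames, schema and approval logic are illustrative only.
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB_PATH = "store_payments.db"  # lives on the in-store server, not in the cloud

def init_db():
    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id INTEGER PRIMARY KEY, amount REAL, card_token TEXT)"
    )
    conn.commit()
    conn.close()

class PaymentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the payment request from a point-of-sale terminal on the local network.
        length = int(self.headers.get("Content-Length", 0))
        payment = json.loads(self.rfile.read(length))

        # Record the transaction locally; nothing leaves the store over the internet.
        conn = sqlite3.connect(DB_PATH)
        conn.execute(
            "INSERT INTO payments (amount, card_token) VALUES (?, ?)",
            (payment["amount"], payment["card_token"]),
        )
        conn.commit()
        conn.close()

        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'{"status": "approved"}')

if __name__ == "__main__":
    init_db()
    # Listen only on the store's LAN.
    HTTPServer(("0.0.0.0", 8080), PaymentHandler).serve_forever()
```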

Edge computing vs. cloud computing

Technically speaking, any on-premises workload can be considered a form of edge computing. The same could be said for other technologies, such as content delivery networks, which also place data closer to end users.

However, the concept of edge computing only became popular following the shift toward cloud-centric infrastructures starting in the late 2000s. Edge computing can solve some of the core performance, reliability and security challenges associated with the cloud.

[Chart: edge cloud vs. cloud computing vs. edge computing, comparing the three models to help determine which is the best fit.]

Latency

Cloud computing platforms require data to be transferred over the internet before it can be processed or stored in cloud data centers. The processed data may also need to be sent back over the internet to end users. Because the public internet is slow compared to local networks, that round trip adds latency.

In the cloud, it can take seconds for an application to receive a user's request, process it and send back the result. On edge infrastructure, response times can be reduced to mere milliseconds because requests never have to cross the internet.
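
You can see the difference with a quick timing script. The sketch below times round trips to two hypothetical endpoints, one on the local network and one in a remote cloud region; substitute URLs that you actually run.

```python
# Time round trips to a nearby edge endpoint and a remote cloud endpoint.
# Both URLs are placeholders for services you would host yourself.
import time
import urllib.request

ENDPOINTS = {
    "edge (on-site server)": "http://192.168.1.50:8080/health",  # hypothetical local address
    "cloud (remote region)": "https://example.com/health",       # hypothetical remote address
}

def measure(url, samples=5):
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        urllib.request.urlopen(url, timeout=10).read()
        timings.append((time.perf_counter() - start) * 1000)  # milliseconds
    return sum(timings) / len(timings)

for name, url in ENDPOINTS.items():
    try:
        print(f"{name}: {measure(url):.1f} ms average round trip")
    except OSError as err:
        print(f"{name}: unreachable ({err})")
```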

Reliability

If a cloud data center becomes unavailable, so do the applications and data hosted in it. Although such outages are rare, they can happen due to network routing problems or damage to a cloud provider's infrastructure.

Edge computing offers the advantage of being able to store critical data locally, where it remains available to the workloads that need it even if the connection to the cloud is disrupted.
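
A common way to get that resilience is a store-and-forward pattern: write every record to local storage first, then sync to the cloud opportunistically. The sketch below assumes a hypothetical cloud ingest URL; a cloud outage simply leaves records queued locally until the next sync attempt.

```python
# Store-and-forward sketch: local writes always succeed, cloud sync is best effort.
# The cloud endpoint is a placeholder.
import sqlite3
import urllib.request

LOCAL_DB = "edge_buffer.db"
CLOUD_ENDPOINT = "https://example.com/ingest"  # hypothetical cloud API

def record_locally(reading: str) -> None:
    """Always succeeds as long as the local site is up."""
    conn = sqlite3.connect(LOCAL_DB)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings "
        "(id INTEGER PRIMARY KEY, payload TEXT, synced INTEGER DEFAULT 0)"
    )
    conn.execute("INSERT INTO readings (payload) VALUES (?)", (reading,))
    conn.commit()
    conn.close()

def sync_to_cloud() -> None:
    """Push unsynced rows; a cloud outage just leaves them queued for later."""
    conn = sqlite3.connect(LOCAL_DB)
    for row_id, payload in conn.execute(
        "SELECT id, payload FROM readings WHERE synced = 0"
    ).fetchall():
        try:
            req = urllib.request.Request(
                CLOUD_ENDPOINT, data=payload.encode(), method="POST"
            )
            urllib.request.urlopen(req, timeout=5)
            conn.execute("UPDATE readings SET synced = 1 WHERE id = ?", (row_id,))
        except OSError:
            break  # cloud unreachable; keep data locally and retry later
    conn.commit()
    conn.close()

record_locally('{"sensor": "freezer-3", "temp_c": -18.2}')
sync_to_cloud()
```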

Security

Data transferred between remote locations and cloud data centers is exposed to potential security risks when it moves over unsecured public networks. It may also be at higher risk sitting in a public cloud provider's data center.

By minimizing the need to transfer data across public networks, edge computing reduces potential security exposure. Edge architectures also give businesses the option of storing data on local infrastructure, where they control security directly, rather than relying on a public cloud provider.

Edge computing services in the cloud

Each major cloud vendor offers multiple solutions for organizations that want to build an edge architecture and integrate it with public cloud services. These offerings fall into three main categories: hybrid cloud platforms, network optimization, and IoT deployment and management.

Hybrid cloud platforms

Each major public cloud provider has a hybrid cloud platform, such as AWS Outposts, Azure Stack and Google Anthos. These services aren't specifically edge computing solutions, but they can serve any kind of hybrid cloud computing need. IT teams can use such offerings to deploy public cloud services within a private data center or another local site that serves to host edge infrastructure.

For example, if a retailer wants to deploy a local payment processing application inside stores, while still managing the workloads using public cloud tools, it could do so on private, local servers that are managed through a platform like Azure Arc or Google Anthos.
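
As a rough illustration of how familiar the tooling stays, the sketch below uses boto3 to launch an EC2 instance into a subnet that has already been created on an AWS Outposts rack; the instance then runs on the on-premises hardware but is managed like any other EC2 instance. The AMI and subnet IDs are placeholders for resources you would provision beforehand.

```python
# Hedged sketch: launching a workload onto an AWS Outposts rack with the same
# EC2 API used in the cloud. All IDs below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # Outposts are anchored to a parent region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder AMI ID
    InstanceType="c5.xlarge",
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",  # placeholder subnet that lives on the Outpost
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "in-store-payments"}],
    }],
)

print(response["Instances"][0]["InstanceId"])
```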

Network optimization

Cloud providers offer services that can optimize network performance for workloads that need to connect to public cloud data centers, such as AWS Local Zones, Azure Fusion Core and Google Cloud Interconnect. These services work in different ways and cater to different use cases, but they can all help minimize latency and optimize network performance for edge architectures that have a public cloud component.

IoT deployment and management

Although not all edge computing use cases involve IoT devices, IoT and edge architectures tend to go hand in hand. IoT devices are often deployed in scattered locations. Because these devices have minimal compute and storage resources, they need to connect to a remote data center to process or store the data they generate. While they could use a conventional cloud data center for this, edge infrastructure offers the advantage of lower latency and reduced network bandwidth consumption.
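
In practice, that often means a device speaks a lightweight protocol such as MQTT to a gateway at the edge site rather than to a cloud endpoint. The sketch below uses the paho-mqtt package to publish readings to a hypothetical broker on the local network; the gateway address and topic are placeholders.

```python
# Hedged sketch of an IoT sensor publishing telemetry to an MQTT broker running
# on a local edge gateway instead of a remote cloud endpoint.
# Requires the paho-mqtt package; addresses and topics are placeholders.
import json
import time

import paho.mqtt.publish as publish

EDGE_GATEWAY = "192.168.1.10"  # hypothetical broker on the local network
TOPIC = "factory/line1/temperature"

for _ in range(6):
    reading = {"sensor": "line1-temp", "celsius": 21.7, "ts": time.time()}
    # The payload stays on the local network; the gateway can aggregate or
    # filter it before anything is forwarded to the cloud.
    publish.single(TOPIC, json.dumps(reading), hostname=EDGE_GATEWAY, port=1883)
    time.sleep(10)
```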

All public clouds offer IoT services to help deploy and manage IoT devices, such as AWS IoT, Azure IoT and Google Cloud IoT. Although these services aren't limited to edge computing use cases, businesses that want to manage IoT networks through the public cloud can add such services as part of an edge management strategy.

[Diagram: an example IoT architecture.]

Edge computing platforms beyond public cloud

You don't need to use public cloud services to set up an edge architecture.

One option is Red Hat OpenShift, the Kubernetes-based application management platform. OpenShift's multicluster management support can deploy and manage container-based workloads at multiple edge locations, with each location running its own cluster. Any Kubernetes distribution can help set up and manage edge workloads, although not all Kubernetes platforms are designed for edge use cases.
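
As a rough sketch of what multicluster edge management involves, the script below uses the Kubernetes Python client to push the same Deployment to several clusters, one kubeconfig context per edge site. The context names and container image are placeholders, and a purpose-built platform such as OpenShift's multicluster management adds policy, rollout and monitoring on top of this basic loop.

```python
# Hedged sketch: deploy the same containerized workload to multiple edge
# clusters, assuming your kubeconfig already has one context per edge site.
# Context names and the image are placeholders.
from kubernetes import client, config

EDGE_CONTEXTS = ["store-boston", "store-chicago", "store-seattle"]  # hypothetical contexts

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "payments-edge"},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": "payments-edge"}},
        "template": {
            "metadata": {"labels": {"app": "payments-edge"}},
            "spec": {"containers": [{
                "name": "payments",
                "image": "registry.example.com/payments:1.4.2",  # placeholder image
            }]},
        },
    },
}

for ctx in EDGE_CONTEXTS:
    # Each edge location runs its own cluster, so target one context at a time.
    api_client = config.new_client_from_config(context=ctx)
    apps = client.AppsV1Api(api_client=api_client)
    apps.create_namespaced_deployment(namespace="default", body=deployment)
    print(f"Deployed payments-edge to {ctx}")
```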

There is nothing stopping you from manually deploying and managing workloads across multiple edge locations yourself, without the help of automated orchestration and management services. You can manually stand up physical servers at multiple edge sites and then provide them with the OSes and workloads you want to run at the edge. But that approach is difficult to pull off at scale due to the management complexity.
