
Choose the right serverless container service

Many IT pros consider serverless containers to be largely hype, while others say they offer real advances in serverless computing. See how the major cloud providers take their swing at this class of service.

Many organizations will face a choice between serverless and container deployment models. However, cloud providers' serverless container offerings can give IT teams a mixture of both.

Serverless computing lets a company install an application or application component without tying it to any specific infrastructure. This means IT teams don't need to manage VMs or container hosts.

Serverless computing offers functions, which are small snippets of code that load and execute tasks in response to events. Functions are a shift in thinking from how we run traditional applications, and customers must keep an eye on serverless usage-based billing, which can produce surprises, both good and bad. Containers offer a way to package and isolate software for deployments that can be more efficient than VMs. The combination -- serverless containers -- can provide the benefits of both these technologies.
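
To make the distinction concrete, here's a minimal sketch of an event-driven function in the AWS Lambda style -- the platform loads the handler only when an event arrives, runs it and bills for the execution time. The event structure shown is a generic placeholder for illustration, not a specific AWS event source.

    import json

    def lambda_handler(event, context):
        # 'event' carries the trigger payload -- for example, a notification
        # about newly arrived records (placeholder structure for illustration)
        record_count = len(event.get("Records", []))
        return {
            "statusCode": 200,
            "body": json.dumps({"processed_records": record_count}),
        }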

But before moving forward with serverless containers, IT teams should be thoughtful about what's driving them in that direction. Consider these questions:

  • Are you looking for operational simplicity -- the "no infrastructure" model -- or are you looking for usage pricing?
  • Have you tried serverless development and found it too restrictive, or are you finding that some container applications use resources inefficiently, thus leading to higher costs?

Keep these questions in mind as you explore serverless containers and the various services from the major cloud providers.

Serverless container concepts

Despite the common serverless terminology, don't think of serverless containers as an evolution of serverless functions. Instead, think of them as an evolution of container deployment and pricing.

Serverless containers are transient components of traditional applications, but they can also process events. Some users will find serverless containers better than functions for event handling because their event processing needs are persistent.

The concept behind serverless containers is to decouple containers from any specific infrastructure so that the containers can be deployed dynamically, almost a form of long-lived function. Unlike functions, serverless containers can persist for as long as the user would like, though there's a practical limit on persistence set by the difference between the pricing of serverless container usage and other container hosting options.
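
As a rough illustration of that pricing crossover, the sketch below compares an always-allocated serverless container against a small dedicated container host. Every price in it is a placeholder assumption for illustration, not a current list price from any provider.

    # Back-of-the-envelope break-even check: at what monthly utilization does a
    # serverless container stop being cheaper than a dedicated container host?
    # All prices are placeholder assumptions, not real provider rates.
    VCPU_PER_HOUR = 0.04      # assumed serverless price per vCPU-hour
    GB_PER_HOUR = 0.004       # assumed serverless price per GB-hour
    VM_PER_MONTH = 30.00      # assumed monthly cost of a small dedicated host

    HOURS_PER_MONTH = 730
    CONTAINER = {"vcpu": 1, "memory_gb": 2}

    def serverless_monthly_cost(hours_running):
        return hours_running * (CONTAINER["vcpu"] * VCPU_PER_HOUR
                                + CONTAINER["memory_gb"] * GB_PER_HOUR)

    for pct in (5, 25, 50, 100):
        cost = serverless_monthly_cost(HOURS_PER_MONTH * pct / 100)
        winner = "serverless" if cost < VM_PER_MONTH else "dedicated host"
        print(f"{pct:3d}% of the month running: ${cost:6.2f} -> {winner} wins")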


Serverless container offerings

Choosing the best serverless container option, or even determining whether one offers value at all, depends on various factors. Offerings vary significantly across cloud providers, and serverless container pricing is hard to compare against functional computing tools, basic containers, managed containers and managed Kubernetes options.

Most cloud pros who report benefits from serverless containers select options from their primary cloud provider, and choose this type of container because of previous inefficiencies and high costs with basic containers.

AWS

Amazon's serverless container model is based on AWS Fargate, which lets users deploy containers more easily in an Amazon Elastic Kubernetes Service or Amazon Elastic Container Service installation. However, AWS Fargate doesn't fully abstract infrastructure the way AWS Lambda does. And while it simplifies container deployment, deployment isn't instant and still requires operations coordination.


Instead, AWS Fargate rightsizes resources for a container application and its installation. Customers pay for what's allocated, but without the same dynamism as Lambda, and Fargate assumes true statelessness and doesn't support persistent storage. Amazon partner Spotinst offers Ocean, a container infrastructure manager and DevOps tool that lets AWS spot instances and other AWS compute instances share a cluster. This is the preferred approach for IT pros who want to mix serverless and traditional containers.
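
For reference, a minimal sketch of launching a task on Fargate with boto3 looks like the following. It assumes a cluster, a registered task definition and network IDs already exist; all of the names below are placeholders.

    import boto3

    # Run one copy of an already-registered ECS task definition on Fargate
    # capacity; AWS allocates the vCPU and memory declared in the definition.
    ecs = boto3.client("ecs", region_name="us-east-1")

    response = ecs.run_task(
        cluster="demo-cluster",            # placeholder cluster name
        taskDefinition="web-app:3",        # placeholder task definition:revision
        launchType="FARGATE",
        count=1,
        networkConfiguration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],      # placeholder IDs
                "securityGroups": ["sg-0123456789abcdef0"],
                "assignPublicIp": "ENABLED",
            }
        },
    )
    print(response["tasks"][0]["taskArn"])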

Microsoft Azure

Microsoft's Azure Container Instances (ACI) offering takes a different approach than AWS. This option decouples serverless containers from infrastructure and orchestration. A user deploys a container instance almost as they would a function, meaning they don't worry about lifecycle management or infrastructure. This is a significant benefit for users who like serverless but don't like the restrictions on how functions are run in serverless mode.


In particular, Azure exposes ACI with persistent addresses, and it supports persistent storage and GPU resources. However, launching a container in ACI is similar to a docker run command, and Microsoft recommends its Azure Kubernetes Service when a workload requires full orchestration.
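
The sketch below shows how close an ACI deployment is to that docker run experience, using the azure-mgmt-containerinstance Python SDK. The subscription, resource group and image names are placeholders, and the method names assume a recent version of the SDK.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.containerinstance import ContainerInstanceManagementClient
    from azure.mgmt.containerinstance.models import (
        Container, ContainerGroup, ContainerPort, IpAddress, Port,
        ResourceRequests, ResourceRequirements,
    )

    # Placeholder subscription ID; authentication uses whatever credentials
    # DefaultAzureCredential finds in the environment.
    client = ContainerInstanceManagementClient(
        DefaultAzureCredential(), "00000000-0000-0000-0000-000000000000")

    container = Container(
        name="hello",
        image="mcr.microsoft.com/azuredocs/aci-helloworld",
        resources=ResourceRequirements(
            requests=ResourceRequests(cpu=1.0, memory_in_gb=1.5)),
        ports=[ContainerPort(port=80)],
    )

    group = ContainerGroup(
        location="eastus",
        containers=[container],
        os_type="Linux",
        # A public IP address makes the instance reachable without any
        # separate orchestration or load balancing.
        ip_address=IpAddress(ports=[Port(protocol="TCP", port=80)], type="Public"),
    )

    client.container_groups.begin_create_or_update(
        "demo-rg", "hello-group", group).result()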

Google Cloud

Google Cloud Run is the serverless container implementation closest to functional computing. It fully abstracts infrastructure and infrastructure management and fully integrates with public IP addresses -- but not with persistent storage. It also doesn't support GPU hosting.

Cloud Run can also support event triggers, so it looks like either a dynamic container or a more persistent function. This serverless container option fully supports autoscaling, including scaling to zero to completely eliminate costs.
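
As a sketch of what that looks like in practice, the example below deploys a public container image with the google-cloud-run client library and explicitly allows the service to scale down to zero instances. The project, region and service names are placeholders.

    from google.cloud import run_v2

    client = run_v2.ServicesClient()

    service = run_v2.Service(
        template=run_v2.RevisionTemplate(
            containers=[run_v2.Container(
                image="us-docker.pkg.dev/cloudrun/container/hello")],
            # min_instance_count=0 lets the service scale to zero when idle,
            # so nothing is billed between requests.
            scaling=run_v2.RevisionScaling(
                min_instance_count=0, max_instance_count=5),
        )
    )

    operation = client.create_service(
        parent="projects/demo-project/locations/us-central1",  # placeholders
        service=service,
        service_id="hello-run",
    )
    print(operation.result().uri)   # public URL of the deployed service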

Finally, Cloud Run is now supported on Anthos, Google's hybrid and multi-cloud orchestration option. Add this to Cloud Run's close adherence to functional serverless principles, and you have a powerful choice.

Key takeaway

Serverless containers aren't right for every situation. Most container users would probably want a traditional managed container service. Consider serverless containers as a special adaptation to low-usage container applications that aren't easily translated into function form.

And remember, it's usage rates that make or break all serverless business cases.
