When you're designing a microservices app, the natural inclination is to deploy those services in containers. Containers have become the de facto solution for hosting microservices, thanks to their scalability and ability to host independent services as part of a cohesive application architecture.
But sometimes it makes more sense to run at least some of your microservices as serverless functions. Let's look at some factors that can help you choose between containers and serverless when deploying microservices.
Serverless functions vs. containers
A serverless function is a unit of code that a platform executes on demand. Using a serverless platform, such as AWS Lambda or Azure Functions, teams can upload code they want to run without having to provision a host server. This can also be done with a framework such as Knative, which makes it possible to run serverless functions inside a Kubernetes cluster.
The serverless platform configures the conditions necessary to deploy code into production. From there, it automatically executes the code whenever the preconfigured conditions are met. After each invocation, the code runs to completion, then stops and waits to be called again by the serverless platform. In most cases, serverless functions are priced based on their execution time, meaning you pay only for the time they spend running.
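As a minimal illustration of this on-demand model, here's a sketch of a handler written in the style of an AWS Lambda Python function. The function name, event fields and defaults are all hypothetical, chosen just to show the invocation-and-return lifecycle described above.

```python
import json

# Hypothetical handler in the style of an AWS Lambda Python function.
# The platform invokes it only when a preconfigured trigger fires
# (an HTTP request, a queue message and so on); between invocations,
# no server runs on your behalf.
def handle_resize_request(event, context=None):
    # Pull parameters from the triggering event; the field names and
    # defaults are illustrative assumptions, not part of any real API.
    width = int(event.get("width", 100))
    height = int(event.get("height", 100))

    # Do the work, run to completion and return a response.
    # On most platforms, billing covers only this execution time.
    return {
        "statusCode": 200,
        "body": json.dumps({"pixels": width * height}),
    }
```

The platform would call `handle_resize_request` once per trigger event, then the function stops until the next invocation.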
With a container, individual applications or microservices run in relative isolation from the host server. To run software inside containers, developers must build a container image based on the code they want to execute, configure a hosting environment (using a technology such as Kubernetes) and deploy containers into it with the container image. Once deployed, a container runs continuously until it is shut down, either manually by an engineer or automatically by an orchestration service.
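To make the contrast concrete, here is a rough sketch of what that hosting setup might look like as a Kubernetes Deployment. The service name, image and port are placeholders, not a real application.

```yaml
# Hypothetical Kubernetes Deployment for one containerized microservice.
# The image must already be built and pushed to a registry; the
# orchestrator then keeps the requested replicas running continuously.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service                 # placeholder name
spec:
  replicas: 2                          # runs until scaled down or deleted
  selector:
    matchLabels:
      app: orders-service
  template:
    metadata:
      labels:
        app: orders-service
    spec:
      containers:
      - name: orders-service
        image: registry.example.com/orders-service:1.0   # placeholder image
        ports:
        - containerPort: 8080
```

Note that nothing in this manifest is request-driven: the replicas stay up, and consume resources, whether or not traffic arrives.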
The pricing models for containers vary widely depending on which type of hosting model you use. In general, running containers requires you to pay for infrastructure resources on an ongoing basis. So, even if your containers are not actively handling requests, you still pay for the resources required to host them because the containers run continuously rather than on demand.
Thus, the key differences between serverless functions and containers include the following:
- Serverless functions run only on demand, whereas containers run continuously.
- Deploying serverless functions doesn't require setting up a full hosting environment; deploying containers does.
- In most cases, serverless functions are deployed on proprietary cloud services. In contrast, containers can run easily in a wide variety of environments, including public cloud and on premises.
- Containers incur costs even when an application is idle. With serverless, you pay only for the time the functions run.
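The cost difference in the last point can be sketched with some back-of-the-envelope arithmetic. Every rate and traffic figure below is a made-up assumption for illustration, not a real provider's price.

```python
# Back-of-the-envelope cost comparison; all numbers are illustrative
# assumptions, not real cloud rates.
HOURS_PER_MONTH = 730

def container_monthly_cost(hourly_rate: float) -> float:
    # A continuously running container is billed for every hour,
    # whether or not it handles any requests.
    return hourly_rate * HOURS_PER_MONTH

def serverless_monthly_cost(invocations: int,
                            seconds_per_invocation: float,
                            rate_per_second: float) -> float:
    # A serverless function is billed only for time spent executing.
    return invocations * seconds_per_invocation * rate_per_second

# Example: a lightly used service handling 100,000 requests a month,
# each taking 200 ms.
container = container_monthly_cost(hourly_rate=0.04)
serverless = serverless_monthly_cost(100_000, 0.2, rate_per_second=0.00002)
```

Under these assumed numbers the idle-heavy workload is far cheaper as a function; the balance shifts back toward containers as utilization approaches continuous.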
Using serverless functions to host microservices
Now let's look at what those differences mean for hosting microservices.
If you choose to deploy one or more microservices using serverless, those microservices will be active within your application only when they are invoked by your serverless platform. Unlike containerized microservices, they won't be always on, ready to receive requests from other microservices.
In addition, the deployment and update processes for functions-based microservices are different. Rather than pushing out a new container image inside an orchestrator to update your microservice, you'd have to deploy a new version of the serverless function.
This can be a more complicated and manual process than updating a containerized microservice; serverless hosting platforms generally don't integrate as tightly with CI/CD pipelines, so you can't automate as many of the tasks required to move updated microservice code from your development environment into production.
It's also likely to be more difficult to integrate serverless microservices into a service mesh, if you use one to help manage your microservices. Service meshes offer only limited support for proprietary serverless platforms such as Lambda, though support for Knative-based functions is stronger. Because service meshes are primarily designed to manage communication between containerized microservices, configuring one to work with serverless functions can be considerably more complicated.
When to use serverless for microservices
Whether the differences between serverless and containerized microservices create an advantage or disadvantage depends on what you plan to do with your microservices.
A microservice could be a good candidate for deployment as a serverless function under certain conditions. Those would include the following:
- The microservice provides functionality -- such as handling an unusual type of user request -- that is required only occasionally and is not a core part of the application.
- The microservice is not updated frequently. Or, if it is, you can manually deploy the updates or use the limited CI/CD integrations of serverless platforms to apply updates automatically.
- You don't need to integrate the microservice tightly into your service mesh, or you don't use a service mesh for your application.
Serverless and containers: Better together
Be aware that you can use serverless functions alongside containers within the same microservices app: some of your microservices can run as serverless functions while you deploy others in containers.
This approach requires more effort. You'll have to manage both serverless functions and containers inside the same app. Even so, it gives you access to the best of both worlds, allowing you to deploy some microservices using serverless functions while sticking with containers for those that aren't a good fit for the serverless model.