An overview of Knative use cases, benefits and challenges
Knative might be more complex than a serverless platform in the public cloud, but for IT shops already committed to Kubernetes and microservices, it could be worth the leap.
Most people associate serverless computing with the public cloud. But some enterprises find that cloud-based serverless deployments don't fit their needs. In these cases, companies can implement another serverless model: Knative.
First, evaluate specific Knative use cases, benefits and challenges, particularly as compared to public cloud-based alternatives. These two serverless frameworks have key differences in terms of cost, resource utilization and management overhead.
What is Knative exactly?
Knative is a Kubernetes-based platform to load and run serverless functions on demand. Enterprises can deploy Knative in any environment -- cloud or on premises -- in which Kubernetes can run.
Architecturally, Knative is an event-steering layer that sits between Kubernetes, the container orchestrator, and Istio, the open source service mesh technology. Kubernetes handles deployment, while Istio connects distributed components, or functions, to each other and to users. Because both Kubernetes and Istio are dominant in their respective areas -- container orchestration and service mesh -- Knative's core components are likely familiar to enterprises.
Knative has three basic pieces. The first -- Knative Build -- builds container images from source code to make them available. The second -- Knative Serving -- uses Kubernetes and Istio to deploy and connect these images through assigned infrastructure resources. The third -- Knative Eventing -- lets users define event triggers and associate them with containerized functions.
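To make the Serving piece concrete: the workload packaged into a container image is typically a small, stateless HTTP server. The sketch below is a minimal illustration in Python using only the standard library. It follows the serverless convention of reading its listening port from the PORT environment variable, which Knative Serving supplies to the container; all names here are illustrative, not part of any Knative API.

```python
# Minimal stateless function for Knative Serving: a small HTTP server in a
# container image. Knative Serving supplies the listening port through the
# PORT environment variable; the handler keeps no state between requests,
# so instances can scale to zero between events. All names are illustrative.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class EventHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Process the event payload here; nothing is retained afterward.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"processed %d bytes" % len(body))

    def log_message(self, fmt, *args):
        pass  # keep per-request logging quiet in this sketch

def serve():
    # The container's entrypoint would call this; PORT defaults to 8080.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), EventHandler).serve_forever()
```

Because the handler holds no state, any number of copies of this container can be started or stopped as event volume changes, which is what lets Knative commit resources only while work is in flight.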
When Knative recognizes an event, it locates the associated process and runs it on demand. Knative commits hosting resources only when a process runs, so there's no need to allocate container pods, nodes and clusters for work that could happen, but isn't happening at the moment. Knative, in this sense, balances container and serverless benefits.
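Conceptually, the trigger-to-function association works like a dispatch table: an event arrives, the matching function is looked up, and it runs only then. The toy Python sketch below models that idea only; real Knative Eventing is configured through resources such as Brokers and Triggers, not application code like this.

```python
# Toy model of the event-to-function association described above: a dispatch
# table maps event types to handlers, and a handler runs only when a
# matching event arrives. This illustrates the concept -- it is not the
# Knative Eventing API.
triggers = {}

def on_event(event_type):
    """Register a handler for an event type (illustrative decorator)."""
    def register(fn):
        triggers[event_type] = fn
        return fn
    return register

@on_event("order.created")
def bill_customer(event):
    # Runs on demand, only when a matching event is dispatched.
    return "billing for %s" % event["order_id"]

def dispatch(event):
    # Locate the function associated with the event and run it.
    handler = triggers.get(event["type"])
    return handler(event) if handler else None

result = dispatch({"type": "order.created", "order_id": "A100"})
```

The key property mirrored here is that nothing runs, and no resources are consumed by the function, until an event of the right type shows up.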
Cost and resource utilization
Serverless platforms in the public cloud load and execute functions in response to triggers. The only limit on the number of function instances is the cloud's capacity and any user-defined cap on instance counts. For all practical purposes, these serverless deployments are limitless -- but so are the costs. An unexpected surge in function trigger events can increase serverless costs significantly.
Traditional container applications have a relatively fixed number of instances, so the resources available for processing work are constant, as are the costs. This is great for consistent workloads, but if workloads are highly variable, admins must size applications for the maximum work levels expected. This means average workloads leave resources underutilized.
Knative enables IT to create a container-based cluster of resources, and then allocate those resources via Kubernetes to specific trigger events. Any Kubernetes cluster, whether in the cloud or the data center, can serve in this role. And if Kubernetes federation is in place -- through a platform such as Google's Anthos, for example -- that federated Kubernetes framework can collectively host serverless functions. The cost, and the resources, available for function hosting are fixed, which is great if workloads are predictable.
Knative use cases
Knative is best for applications that generate a variable number of trigger events that, over time, stay within established limits. While a single application can meet this requirement, it's more likely met by a combination of three or more applications whose triggers aren't synchronized.
Specific Knative use cases include IoT, network monitoring, application monitoring, website testing and validation, and mobile app front-end processes that act as event generators. This event orientation is important; if IT teams can't visualize an application as a series of events, rather than as a series of transactions, Knative might not be a good choice, for both efficiency and functional reasons.
Let's look at the efficiency challenge first.
A Knative deployment that supports a proper mix of applications likely offers better serverless performance at a lower cost than public cloud services, even if some or all the containers are hosted in the public cloud. An improper mix of applications, however, could result in either underutilized container resources and higher costs, or a resource shortage under load that compromises application performance. This is the biggest challenge of Knative -- the wrong applications, or a poorly sized resource pool, can destroy all its benefits.
Perform tests to validate a mix of applications and resource quantities on Knative. Measure the applications' event loads, establishing both the maximum and average load for each. To estimate total resource consumption, time Knative process deployments at the expected event levels. Do this for several applications to create a trial configuration, then test that configuration to validate the estimates. Even then, phase in the deployment gradually, and adjust the application mix, the resource configuration or both as test results dictate.
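To make the sizing exercise concrete, the sketch below compares two ways to size a shared pool: for the worst case, where every application peaks at once, and for a staggered case, where only one application peaks while the others run at average load. All event rates and per-event CPU costs are hypothetical placeholders for measured values, and the staggered-peak heuristic is one reasonable rule of thumb, not a Knative formula.

```python
# Back-of-the-envelope sizing sketch for a shared Knative resource pool.
# All numbers are hypothetical; substitute measured event rates and costs.
# Each app: (average events/sec, peak events/sec, CPU-seconds per event)
apps = {
    "iot-ingest":      (40.0, 120.0, 0.05),
    "site-monitoring": (10.0,  60.0, 0.10),
    "mobile-frontend": (25.0,  90.0, 0.08),
}

def cpu_demand(events_per_sec, cpu_sec_per_event):
    """Steady-state CPU cores needed to keep up with an event rate."""
    return events_per_sec * cpu_sec_per_event

# Sizing for every application's peak at once -- the worst case.
worst_case = sum(cpu_demand(peak, cost) for _, peak, cost in apps.values())

# If the applications' triggers aren't synchronized, a common heuristic is
# to size for the largest single peak plus the other apps' average loads.
staggered = max(
    cpu_demand(peak, cost)
    + sum(cpu_demand(avg2, cost2)
          for name2, (avg2, _, cost2) in apps.items() if name2 != name)
    for name, (avg, peak, cost) in apps.items()
)

print("worst-case pool: %.1f cores" % worst_case)
print("staggered-peak pool: %.1f cores" % staggered)
```

With these placeholder numbers, the staggered estimate is roughly half the worst case, which is exactly the economy an unsynchronized application mix is supposed to deliver; the gap closes, and Knative's advantage shrinks, as the applications' peaks start to coincide.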
Knative's functional challenge is two-pronged. First, Knative is built on functions or microservices that fit a stateless model, meaning no data is stored in the software component itself. Developing such functions isn't difficult, but it requires a shift in approach -- and mistakes mean the software won't perform as expected. Second, because business transactions normally involve multiple steps, stateless functions must maintain context across all of those steps. Knative doesn't provide that capability, as public cloud serverless tools do; IT teams must provide state control on their own.
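One common way to provide that state control is to externalize transaction context to a shared store keyed by transaction ID, so any function instance can pick up where the previous step left off. The Python sketch below illustrates the pattern; the in-memory dict stands in for an external service such as Redis or a database, and the event names and helpers are hypothetical.

```python
# Sketch of externalized state for a multi-step transaction handled by
# stateless functions. Each step may run in a different function instance,
# so context lives in a shared store keyed by transaction ID, never in the
# function itself. The dict stands in for an external store (e.g., Redis);
# event names and helpers are illustrative only.
import json

store = {}  # stand-in for an external, shared key-value store

def save_context(txn_id, context):
    store[txn_id] = json.dumps(context)

def load_context(txn_id):
    return json.loads(store.get(txn_id, "{}"))

def handle_order_created(txn_id, payload):
    # Step 1: record the order; context goes to the store, not the function.
    save_context(txn_id, {"step": "created", "items": payload["items"]})

def handle_payment_received(txn_id, payload):
    # Step 2: a different instance reloads the context and carries on.
    ctx = load_context(txn_id)
    ctx.update({"step": "paid", "amount": payload["amount"]})
    save_context(txn_id, ctx)
    return ctx

# Two events for the same transaction, possibly handled by different
# function instances at different times.
handle_order_created("txn-42", {"items": ["widget"]})
final = handle_payment_received("txn-42", {"amount": 19.99})
```

The design choice that matters is that every handler starts by reloading context and ends by persisting it, which is what keeps the functions themselves stateless and safe to scale to zero.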
There's also an operational challenge with Knative, as compared to serverless offerings in the public cloud. With the latter, admins don't manage the underlying servers. With Knative, they not only need to manage servers, but also containers, Kubernetes, Istio and Knative itself.
That said, for enterprises already committed to containers and Kubernetes, Knative only minimally extends development and operations complexity. And those committed to microservices and service mesh will find Knative to be a nearly natural extension of capabilities. The more an organization plans to use microservices and serverless technology, the more likely Knative will offer lower costs than cloud provider alternatives.