Factor performance into an application modernization strategy
Dev teams increasingly use cloud, containers and microservices to modernize apps, but these technologies bring little value unless they boost performance, and that boost is never guaranteed.
Some enterprises think all you need to do to modernize legacy applications is redeploy them to the cloud or move them to a container. But it's rarely that simple. Cloud migrations and other approaches to modernization don't automatically translate into what matters most to the business: better application performance.
Whether they choose the cloud, containers or microservices, developers should think carefully about performance as they pursue an application modernization strategy.
Modern tech is not a performance cure-all
The goal of modernization is to move an application from a legacy deployment model to one that is more agile, but that doesn't guarantee better performance. And, after all, no one is going to choose to do business with you simply because you host your applications in the cloud or deploy them with Docker and Kubernetes. Customers and end users don't care about that; they care about how well your applications run.
If your application suffers from underlying performance problems, such as memory leaks or network bottlenecks, these problems will persist after it moves to a modern environment.
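A defect like that travels with the code, not with the host. As a hypothetical illustration, the short Python handler below caches every response in a module-level dictionary and never evicts anything; it will exhaust memory just as surely in a cloud VM or a container as it does in the data center.

# Hypothetical request handler with a classic leak: an unbounded module-level
# cache that grows for the lifetime of the process, wherever it is deployed.
_cache = {}

def handle_request(request_id: str, payload: bytes) -> bytes:
    # Every response is cached under a unique request ID and never evicted,
    # so the process's memory footprint climbs steadily under load.
    result = payload.upper()      # stand-in for real work
    _cache[request_id] = result   # the leak: nothing ever removes entries
    return result

# A bounded cache (for example, functools.lru_cache or an LRU dict with a
# maximum size) fixes the defect; moving the process to a bigger host does not.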
To ensure optimal performance, developers need to understand the nuances of three common application modernization strategies: cloud migration, containerization and microservices adoption. These strategies are not mutually exclusive; for example, you could containerize applications and then deploy those containers in the cloud. Whichever of them you implement, it's critical to understand the performance tradeoffs involved.
Cloud
When you move an application to the cloud, you will likely see an immediate performance boost, as hosting resources are available on demand and virtually without limit. As a result, an application that suffers from a memory leak might run faster in the cloud, where it can simply consume more memory as it needs it. From end users' perspectives, and by metrics such as average response time, your app might appear to perform better in the cloud.
But that perception is deceptive. To refer back to the memory leak example, that issue won't go away on its own, and you'll pay more for the extra memory that the app consumes in the cloud. In this case, cloud migration doesn't solve the performance problem; it just slaps an expensive bandage on it and leaves you with ever-increasing technical debt.
If your application modernization strategy centers on the cloud, don't confuse metrics like response time with performance. Pay attention to cloud costs, which largely reflect how efficiently the application uses resources. If you spend more on cloud resources for your app than you would to maintain it on premises, that's a sign the app has underlying performance issues.
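As a rough, hypothetical illustration of what that bandage costs (the leak rate, restart interval and per-gigabyte price below are assumptions, not vendor pricing), a steady leak maps directly onto extra memory you must provision and pay for every month:

# Back-of-the-envelope estimate of what a memory leak costs in the cloud.
# All numbers are illustrative assumptions, not real vendor pricing.
LEAK_MB_PER_HOUR = 50            # assumed steady leak rate
RESTART_INTERVAL_HOURS = 24 * 7  # process restarted weekly
GB_MONTH_PRICE = 5.00            # assumed price per GB of RAM per month

peak_leak_gb = LEAK_MB_PER_HOUR * RESTART_INTERVAL_HOURS / 1024
extra_monthly_cost = peak_leak_gb * GB_MONTH_PRICE

print(f"Peak leaked memory before restart: {peak_leak_gb:.1f} GB")
print(f"Extra memory you must provision:   ~{peak_leak_gb:.1f} GB")
print(f"Approximate added monthly cost:    ${extra_monthly_cost:.2f}")

Under these assumed numbers, the leak alone forces you to provision roughly 8 GB of headroom and adds about $41 to the monthly bill, and the figure grows with traffic rather than shrinking.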
Containerization
Containers create a similar challenge. They enable you to deploy an app in an isolated, lightweight virtual environment without the overhead of traditional VMs, which leaves more resources available for app use.
This means a poorly performing app might run a little faster inside a container than inside a VM, as the container host server has more resources it can expend on the app. But, ultimately, you will still waste valuable system resources if the app's design is inefficient.
What's more, containers introduce performance considerations of their own, such as CPU and memory limits and overlay networking overhead. If you containerize an application without correcting underlying problems, the app might experience poorer response times -- on top of inefficient use of system resources -- and your application modernization strategy will backfire spectacularly.
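One concrete example of a container-specific issue: a process that outgrows the memory limit its cgroup imposes gets throttled or killed outright rather than merely slowing down. A hypothetical startup check like the Python sketch below, which reads that limit (the path differs between cgroup v1 and v2), can surface the mismatch before it shows up as failed requests.

import os

# Read the memory limit imposed on this container by its cgroup.
# The file location differs between cgroup v2 and the older v1 hierarchy.
CGROUP_V2_LIMIT = "/sys/fs/cgroup/memory.max"
CGROUP_V1_LIMIT = "/sys/fs/cgroup/memory/memory.limit_in_bytes"

def container_memory_limit_bytes() -> int | None:
    for path in (CGROUP_V2_LIMIT, CGROUP_V1_LIMIT):
        if os.path.exists(path):
            raw = open(path).read().strip()
            if raw == "max":   # cgroup v2 reports "max" when unlimited
                return None
            return int(raw)
    return None  # no recognizable cgroup limit found

limit = container_memory_limit_bytes()
if limit is not None and limit < 512 * 1024 * 1024:
    # The 512 MiB threshold here is an arbitrary example value; compare the
    # limit against what the app actually needs under load.
    print(f"Warning: container memory limit is only {limit / 2**20:.0f} MiB")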
Microservices
If you refactor a monolithic application to run as microservices in a loosely coupled architecture, the app, in theory, can take advantage of greater, more finely controlled scalability, resilience against security intrusions and more seamless updates.
Yet, when you move a legacy app to a microservices architecture, some thorny performance risks can arise. Unlike the components of a monolithic app, microservices typically rely on the network to communicate with one another via APIs. Network problems or poorly written APIs can quickly degrade performance. Think carefully about where you host each microservice to optimize communication and ensure high performance.
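To make the network dependency concrete, here is a minimal sketch, using only the Python standard library and a placeholder service URL, of one microservice calling another over HTTP with a timeout, a simple retry and latency logging. Every extra hop in a request's path pays this cost again.

import time
import urllib.error
import urllib.request

# Placeholder address for a downstream microservice; in a real deployment this
# would come from service discovery or configuration.
INVENTORY_SERVICE_URL = "http://inventory.internal:8080/items/42"

def call_with_retry(url: str, timeout_s: float = 0.5, retries: int = 2) -> bytes:
    """Call a downstream service, retrying on failure and reporting latency."""
    last_error = None
    for attempt in range(1, retries + 2):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout_s) as resp:
                body = resp.read()
            elapsed_ms = (time.monotonic() - start) * 1000
            print(f"attempt {attempt}: {elapsed_ms:.0f} ms")
            return body
        except (urllib.error.URLError, TimeoutError) as err:
            last_error = err
            print(f"attempt {attempt} failed: {err}")
    raise RuntimeError(f"downstream service unreachable: {last_error}")

# A request that fans out to several services pays this latency on every hop,
# so slow networks or chatty APIs quickly dominate end-to-end response time.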
Additionally, microservices-based apps are difficult to manage. While tools such as load balancers and orchestrators can help tune microservices performance, you need the right staff and processes in place. If you don't specify who owns each individual microservice, some might not receive proper maintenance. Also, ensure that microservices remain compatible with one another as they evolve, especially if different teams develop and maintain them. Complexity and ownership issues can slow continuous delivery and delay innovation, which is sure to raise questions about the business value of the modernization strategy.
These examples hint at why an enterprise might stick with a monolithic app over microservices. If you don't have the resources to manage a microservices architecture, there's no shame in monoliths.