https://www.techtarget.com/searchsoftwarequality/opinion/Docker-vs-VMs-Is-the-VM-all-that-bad
While many think VMs possess superior flexibility, security and support for specific software, containers are often seen as the go-to for development and cloud deployments with rapid scaling requirements.
With these views in mind, let's dive into the specific benefits of containers vs. VMs that can help guide the decision about which virtualization approach to adopt.
A VM enables a single physical machine to run multiple OSes. With VMs, users can run programs on their computers that the host OS wouldn't normally support. Imagine a computer is running macOS, but a user wants to run a program only available on Windows. Instead of having to install a new OS on that same computer, the user could run a VM with a virtualized Windows OS to run the program. This same virtualization technology is what enables cloud providers such as AWS to offer a variety of OSes on their servers.
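As a rough illustration of that idea, the following minimal sketch launches a Windows guest VM from a host using the open source QEMU hypervisor driven from Python. It assumes QEMU is installed and that a prepared Windows disk image named windows.img already exists; both the tool choice and the file name are illustrative assumptions, not something prescribed by the article.

    # Illustrative only: boot a Windows guest VM on a macOS or Linux host with QEMU.
    # Assumes QEMU is installed and "windows.img" is a prepared Windows disk image.
    import subprocess

    subprocess.run([
        "qemu-system-x86_64",
        "-m", "4096",           # give the guest 4 GB of RAM
        "-smp", "2",            # two virtual CPUs
        "-hda", "windows.img",  # virtual disk containing the guest OS
    ])

The guest gets its own full OS and its own dedicated slice of memory and CPU, which is exactly the property the next sections contrast with containers.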
Despite their age, VMs arguably still hold distinct advantages over newer containers in a few areas, notably flexibility, security and support for specific software.
Where containers typically beat VMs is resource consumption. While a VM gives each guest OS its own dedicated system resources, a container shares system resources with the host. A single container's footprint is much smaller, requiring only the libraries the application uses on top of the host's OS.
For cloud computing, this distinction is important. Because each VM installs an entire OS, it's not always efficient to run many VMs on a single host. It is efficient to run many containers on one host, as the host OS is shared across containers. For modern applications that frequently scale horizontally -- that is, many instances of the same application running in parallel -- a single server could run only a limited number of VMs, yet the same server could likely run 10 times as many containers with the same amount of resources.
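As a rough sketch of that density, the snippet below starts a batch of lightweight containers on one host using the Docker SDK for Python (the docker package). It assumes a local Docker daemon and uses the small nginx:alpine image purely as an example workload; the count of ten is arbitrary.

    # Illustrative sketch: start several lightweight containers on one host.
    # Assumes the Docker daemon is running locally and the SDK is installed
    # (pip install docker). Image and count are arbitrary examples.
    import docker

    client = docker.from_env()

    # Launch ten instances of the same small web server image. Each one shares
    # the host kernel and only ships its own libraries, so startup is cheap.
    web_containers = [
        client.containers.run(
            "nginx:alpine",
            detach=True,
            labels={"app": "web-demo"},
        )
        for _ in range(10)
    ]

    print(f"Running containers on this host: {len(client.containers.list())}")

    # Clean up the demo containers.
    for c in web_containers:
        c.stop()
        c.remove()

Launching the equivalent number of VMs would mean booting ten full guest OSes, each with its own reserved memory and disk.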
Essentially, containers greatly simplify the way applications run. There is no need to provision a server with a particular OS and specific libraries; instead, developers create container images that bundle everything the application needs to run. Once an application is deployed as a container, scaling the number of running instances up or down is trivial. This is why orchestrators such as Kubernetes can scale so quickly: Starting another container adds little overhead beyond the system resources the containers already share.
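To make the scale-up-and-down idea concrete, here is a toy reconciliation sketch in the same vein, again with the Docker SDK for Python. It is not how Kubernetes is implemented, just an illustration of how cheap adding or removing container instances is; the image name, label and replica counts are all assumptions for the example.

    # Toy scaling sketch, not a real orchestrator: reconcile the number of
    # running containers carrying a label toward a desired replica count.
    import docker

    client = docker.from_env()
    LABEL = "app=scale-demo"   # hypothetical label grouping our instances
    IMAGE = "nginx:alpine"     # example image

    def scale_to(replicas: int) -> None:
        running = client.containers.list(filters={"label": LABEL})
        if len(running) < replicas:
            # Scaling out is cheap: no new OS to boot, just a new process.
            for _ in range(replicas - len(running)):
                client.containers.run(IMAGE, detach=True, labels={"app": "scale-demo"})
        else:
            # Scale in by stopping and removing the extras.
            for container in running[replicas:]:
                container.stop()
                container.remove()

    scale_to(5)   # scale out to five instances
    scale_to(2)   # then back down to two

A real orchestrator adds scheduling across many hosts, health checks and declarative configuration, but the underlying cost of each additional replica is just as small.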