By 2020, half of the world’s businesses will have containerized applications in production, according to Gartner; only a year ago, that figure stood at 20%. These two numbers are enough to illustrate how rapidly and massively companies of every size – from small businesses to large accounts and startups – are adopting containers, but they do not explain it. Why are containers needed in IT architectures? And why now, when container technology itself was born a decade ago? Some answers.
Containers, the armed wing of agile information systems
For 20 years, information systems have been steadily evolving from monolithic computing to a far more composite model. The (old) concept of SOA (Service-Oriented Architecture) has since given way to today's microservices. In this architecture, applications are composed of autonomous services that can evolve independently of one another.
But what good is it to develop services in a granular, agile way if deployment stumbles over the variety of operating systems, conflicting libraries or variable network configurations? The digital transformation under way calls for global, end-to-end agility, from development to production. And that is precisely the point of containers. With them, modern architectures have found a strong ally for extending agility all the way to production, including (but not only) when they rely on microservices.
Containers, a new portability/performance ratio
It is their portability/performance ratio that gives containers their value in modern architectures. Like an application bubble, a container embeds a software service along with all of its dependencies (libraries and binaries), its configuration files, the versions of the system tools it uses (browser, shell, etc.) and any modified system variables (default access paths, registry entries, etc.).
Compared with a virtual machine (VM), one thing is missing from this list: the operating system, which the container does not include. This is one of the key differences between VMs and containers: to run five VMs on a physical server, you need a hypervisor and five operating systems. With containers, a single OS is enough, the five applications sharing the operating system's kernel.
As a result, where a VM relies on a hypervisor that emulates a physical machine, a container runs directly on the operating system's kernel. In other words, a container offers both a high level of application isolation, which shields the application from subtle variations in the host system, and better performance, since a container consumes fewer hardware resources than a VM.
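One quick way to see this kernel sharing in practice is a sketch like the following (it assumes Docker is installed and its daemon is running; the `alpine` image is simply a common, minimal public image used for illustration):

```shell
# On the host: print the kernel release.
uname -r

# Inside a container: run the same command in a minimal Alpine image.
# Even though Alpine is a different distribution from the host,
# the reported kernel release is identical, because the container
# does not ship its own kernel -- it shares the host's.
docker run --rm alpine uname -r
```

If the two commands printed different kernel versions, you would be looking at a VM, not a container.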
Docker or the democratized container
Other advantages stem from the very nature of containers:
- compactness (a container loads only the application and its dependencies),
- speed (a container starts almost instantly, while a VM goes through a full boot sequence),
- and, not least, portability.
Since VMs are heavily dependent on their hypervisor, moving from a VMware environment to Hyper-V is anything but trivial. Containers, for their part, benefit from Docker's de facto standardization.
It is hard, indeed, to talk about containers without mentioning Docker, so closely have the two become synonymous (even though alternatives to Docker exist).
And for good reason: Docker has genuinely democratized the use of containers by simplifying and standardizing the approach:
- a container format,
- instruction sets to create, deploy and manage containers, along with an image format that freezes a container into a file easily moved from one environment to another,
- an application virtualization layer that runs and controls containers on top of the kernel.
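To make that image format concrete, here is a minimal, hypothetical Dockerfile for a small Python service (the file names, base image tag and start command are illustrative, not taken from the article):

```dockerfile
# Start from an official, minimal Python base image (illustrative tag).
FROM python:3.12-slim

# Copy the application and its pinned dependencies into the image,
# so the container carries everything it needs to run.
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Declare the command executed when a container starts from this image.
CMD ["python", "app.py"]
```

Everything the service depends on is baked into the image, which is what makes the resulting container portable from one environment to another.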
In short, Docker is to containers what the hypervisor is to VMs.
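In practice, that analogy maps onto a handful of Docker CLI commands; the sketch below assumes Docker is installed, and the image and container names are purely illustrative:

```shell
# Build an image from the Dockerfile in the current directory.
docker build -t my-service:1.0 .

# Start a container from that image, detached, with a port mapping.
docker run -d --name my-service -p 8080:8080 my-service:1.0

# List running containers, then stop and remove this one.
docker ps
docker stop my-service
docker rm my-service
```

The same image can be run unchanged on a laptop, an on-premises server or a cloud instance, which is the portability the article describes.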
Containers, a natural ingredient of the cloud
This standardization, together with their compactness, makes containers a natural ingredient of cloud architectures, especially since a container can be instantiated only when needed, its near-instant start adding little or no execution overhead. These characteristics matter greatly in the cloud, where billing depends on the memory allocated, the compute power used and the storage consumed.
In addition, the portability of containers makes the transition from development to operations, or from one cloud environment to another, far more reliable, predictable and reproducible. The result is significant time savings at every stage, and therefore tangible cost savings.
In short, with containers, modern architectures built on microservices and backed by cloud resources have found their new currency. Just as a common currency fluidifies trade and accelerates growth, containers streamline the IT production chain, from design to deployment, in the service of companies' overall agility. Containers are now a currency that information systems will have to reckon with. And for a long time.