Software containerization with Docker reviewed

by | Feb 29, 2020 | Docker


Putting software into containers seems to be state of the art. But what are the benefits? Are there any drawbacks? Most people have heard of Docker, and many have used it. But as we look toward more recent technologies like Kubernetes, we tend to forget Docker's original intentions. Let us review Docker software containerization and rate its benefits.

The beginnings of software distributions

A few years ago, creating software distributions received little attention. The main work was considered done when development finished. Installing software was a poor man's job.

The most straightforward way to distribute software is an installation guide. Such a guide summarizes all the steps needed to install the software in any target environment. The operator installing a software package reads this guide and works through it. In effect, the installation guide is a checklist, and installing the software means processing that checklist step by step.

Target environments are complex and vary in many ways: in shell environment settings, for example, or in the available third-party software packages. Over many iterations, the checklist grows longer and more complex. Processing it becomes a time-consuming job, and every new target environment costs extra effort. Because of the human factor, the error rate rises with growing checklists.

Assume that operations runs more than one piece of software. In that case, whole folders of installation instructions exist, and with every additional piece of software, the folder keeps growing.

Automated installation of software

With a rising number of target environments, the effort grows. Time is money, and operational aspects come into focus. Automating the installation process saves a lot of time. The calculation is simple: automation pays off when its cost is lower than the manual operating cost it replaces. Automating the installation process is a cost-benefit calculation, and we have to decide on the appropriate level of automation. In many cases, a piece of software gets installed ten to a hundred times.
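To make the idea concrete, such a checklist can be turned into a small install script. The following is a minimal sketch; the application name (`myapp`), the target directory, and the configuration values are hypothetical examples, not part of any real product.

```shell
#!/bin/sh
# install.sh - a checklist turned into an automated installation (sketch).
# All names and paths (myapp, PREFIX) are hypothetical.
set -eu

# Target directory, overridable per environment via the first argument.
PREFIX="${1:-./myapp-install}"

# Step 1: verify that a required third-party tool is present.
command -v tar >/dev/null || { echo "error: tar is required" >&2; exit 1; }

# Step 2: create the target directory structure.
mkdir -p "$PREFIX/bin" "$PREFIX/etc"

# Step 3: install the application files.
printf '#!/bin/sh\necho "myapp 1.0"\n' > "$PREFIX/bin/myapp"
chmod +x "$PREFIX/bin/myapp"

# Step 4: write a default configuration.
echo "port=8080" > "$PREFIX/etc/myapp.conf"

echo "installed myapp to $PREFIX"
```

Every checklist step becomes one script step, so the script documents the installation and executes it at the same time, and it runs identically on the tenth environment as on the first.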

So the operation engineer becomes happier than before. First, he loses a lot of tedious work. Second, he can perform more software installations in the same amount of time.

Is everything fine with software distribution?

The problems of the past are solved, but evolution is a never-ending process. Checklists and automated installations save a lot of time and money. Unfortunately, many issues remain. Environments differ in so many ways that it becomes hard to take all facets into account. For example, changes in the shell environment cause conflicts, or package manager updates pull in incompatible revisions of tools.

Resolving these conflicts becomes challenging, especially when different software packages share the same environment. The checklists and installers do not account for other, non-standard tools. How could they?

And finally, after every conflict has been resolved, the operating system reaches end of life and cannot be updated anymore. As a result, we format the disk and install a fresh, empty operating system. And the groundhog greets us daily as the installation starts all over again.

How to make software distribution better?

The software itself is not the problem; the installation in a specific environment is. There are many tools, like Chef, Puppet, or Ansible, that address this issue. They all share the same approach: they do their best to fit software into an environment, forming a kind of universal installation mechanism. Unfortunately, these tools are not trivial to use. They are robust but sophisticated tools.

Are there no other approaches? Think of live CDs with preinstalled software. Such software comes with its own environment. It is easy to use because there is no complex installation process, and, notably, it is quick to clean up. Running full virtual machines on the local workstation is expensive, but the idea of running software in a closed environment is a good one.

How nice would it be if software could run in a closed container? The container would include everything required to run the software, and deleting the container would remove all artifacts from the workstation. Just fiction?

The Docker technology

With Docker technology, this vision becomes reality: the containerization of software distributions. Docker packages software into containers. Such a container includes all runtime dependencies, and to the software inside, the container looks like a complete operating system environment.
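The packaging itself is described in a Dockerfile. The following is a minimal sketch for a small Python web application; the base image, file names, and port are hypothetical examples.

```dockerfile
# Dockerfile - hypothetical example packaging an app with its runtime
FROM python:3.12-slim        # base image provides the OS environment

WORKDIR /app
COPY requirements.txt .
# Bake all third-party dependencies into the image itself.
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

EXPOSE 8080
CMD ["python", "app.py"]     # the single process this container runs
```

Building this file with `docker build` produces an image that carries the application together with every runtime dependency, so the checklist from earlier collapses into a single artifact.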

Compared to a virtual machine, a Docker container is slim. There are many benefits:

  • Starting and stopping software becomes unified. Starting a container works the same way regardless of its contents. 
  • Software installation becomes unified. Installing new software means deploying a container.
  • The container offers an isolated environment, separating operating system resources such as sockets and file system contents. Two or more processes can each use the same port without conflict, so containers no longer clash with each other.
  • Software rollouts and rollbacks become comfortable. Just deploy the new container and remove the old one.
  • The container runs a single operating system process. Container processes and classic processes coexist harmlessly.
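These benefits show up directly on the command line. The following sketch assumes a running Docker daemon and a hypothetical image name (`example/web`); it is an illustration, not a recipe for a real deployment.

```shell
# Two containers from the same image listen on the same container port;
# only the published host ports differ, so they do not conflict.
docker run -d --name web-a -p 8081:8080 example/web:1.0
docker run -d --name web-b -p 8082:8080 example/web:1.0

# Starting and stopping is uniform, regardless of the container contents.
docker stop web-a
docker start web-a

# Rollout: deploy the new version, then remove the old containers.
docker run -d --name web-v2 -p 8083:8080 example/web:2.0
docker rm -f web-a web-b
```

Note that none of these commands depend on what runs inside the image; the operator handles a database, a web server, or a batch job with exactly the same verbs.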

Conclusion

Docker technology reduces operational effort. The software expert is the developer: he packages his software into a container, and no one else has to deal with the specifics of the software installation. From an operational point of view, the operator treats all software the same way. All software looks alike, just a container with a different label.