Container-based virtualization, most likely known by the name "Docker", has become mainstream, or even some kind of commodity, at Haufe. Docker can be found as part of the internal build infrastructure (build agents for GoCD etc.), as a single deployment artifact for simple setups (container images for proxies, web servers, ...), or as a base technology in more complex scenarios, e.g. solutions deployed into a hosted Kubernetes cluster.
Three years ago (~2015), container virtualization and orchestration were more of a "do it yourself" scenario and required a deep understanding of cloud technologies and Docker to get anything started.
Today, services from Docker, Google, Amazon and Microsoft take care of nearly everything a project requires to use or create Docker images, provide runtime environments, or handle operational tasks (monitoring, logging, ...).
While some container-related battles have been decided in favor of one of the factions (think orchestration: Kubernetes vs. Docker Swarm), the container technology stack is still evolving and keeps changing or adding new features (Docker Windows containers, anyone?). To the advantage of users, developers and engineers, almost all of these technological changes happen "outside" Docker images and keep breaking changes at bay.
Step by step, Docker is being reduced (again) to encapsulating well-designed and well-implemented software components, while cloud services take care of the heavy lifting of infrastructure and operations.
Docker will be around for quite some time and will be accompanied by "new" technologies such as "serverless" (aka AWS Lambda, Azure Functions, ...). In some cases, Docker is already one of the technological foundations new services are built on (e.g. machine learning with AWS SageMaker).
Docker is a technology to implement the infrastructure-as-code principle: it allows automated and reproducible builds and deployments. Automated deployments are a must-have when migrating solutions to the cloud.
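As a minimal sketch of what infrastructure as code looks like in practice, a Dockerfile describes an image in a reproducible, versionable way. The base image and paths below are illustrative assumptions, not a specific Haufe project:

```dockerfile
# Illustrative example: build a small PHP web application image.
# Base image tag and source path are assumptions for this sketch.
FROM php:7.2-apache

# Copy the application source into the web server's document root.
COPY src/ /var/www/html/

# Document the port the web server listens on.
EXPOSE 80
```

Because this file lives in version control next to the application code, every build of the image is reproducible and reviewable like any other code change.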
Docker is currently the most-used solution for creating and managing container-based infrastructures and deployments.
Essentially, Docker is a platform to build container images, distribute them, and run them as isolated processes (using Linux kernel cgroups, network namespaces and custom mounts).
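The build, distribute and run steps map directly onto the Docker CLI. A sketch of the workflow (the image name `myorg/myapp` is a placeholder, and running these commands requires a Docker daemon):

```shell
# Build an image from the Dockerfile in the current directory.
docker build -t myorg/myapp:1.0 .

# Distribute: push the image to a registry.
docker push myorg/myapp:1.0

# Run it as an isolated process, mapping container port 80 to host port 8080.
docker run -d -p 8080:80 myorg/myapp:1.0
```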
In a DevOps environment, this helps a lot, as we can run exactly the same software and runtime (such as PHP) in production and locally during development. This allows us to debug our software much more easily.
Docker also allows us to keep our development setups much smaller and faster: instead of VirtualBox setups on a per-project basis, we can compose our project development setup out of small containers. A CI environment that builds the containers allows us to package and test the whole environment, instead of different software components on different runtimes, in a much more stable way.
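Composing a development setup out of small containers might look like the following `docker-compose.yml` sketch; the service names, images, ports and credentials are illustrative assumptions:

```yaml
# Illustrative development setup: a PHP web app plus a database,
# each in its own small container. Images and versions are assumptions.
version: "3"
services:
  web:
    image: php:7.2-apache
    ports:
      - "8080:80"
    volumes:
      - ./src:/var/www/html
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
```

A single `docker compose up` then starts the whole project environment, which replaces maintaining a per-project virtual machine.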
Backed by services such as Kubernetes, we can deploy Docker containers on a flexible infrastructure and enable our developers to test their software more easily in different environments.
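On the Kubernetes side, the same container image can be deployed declaratively. A minimal Deployment manifest might look like this (`myorg/myapp` and the replica count are placeholders for this sketch):

```yaml
# Illustrative Kubernetes Deployment running a container image
# in a cluster; the name and image are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 2
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: myapp
          image: myorg/myapp:1.0
          ports:
            - containerPort: 80
```

Because the desired state is declared in the manifest, the same image can be rolled out to different environments (test, staging, production) by changing only configuration.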
Here at Haufe, we are assessing Docker in different projects to become more flexible and faster, which lets us focus on developing even better and more stable software.
All applications that run on AWS or Azure use Docker. The following is an (incomplete) list of projects.
Thomas Schüring firstname.lastname@example.org