2. 8 Good Reasons to Learn Docker
There are many technologies you must learn
in order to master the modern development and DevOps
ecosystem, but Docker (along with orchestration and
the wider container ecosystem) is one of the most
important skills to have nowadays.
Optimize Hardware Usage
Distribution & Collaboration
Multi-Tenancy & High Availability
CI/CD — Build Once, Run Everywhere
Isolation & The Dependency Hell
Using The Ecosystem
Development in a cross-platform environment is easier
Deployment of Applications is easier
3. Optimize Hardware Usage
Like virtual machines before it, containerization
changed the way we optimize our hardware usage, but
virtual machines come with limits.
Imagine a scenario where you must host two
applications that use the same language but not the
same version. You might find a solution, or more
likely a hack, to run both in a single VM, but you
could just as easily end up creating a new VM to
host the second application.
VMs can have such limits; using Docker, you can
easily deploy multiple applications that use
different versions of the same language without
creating a new VM.
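As a rough sketch, assuming two Node.js applications whose code sits in ./legacy and ./modern (the paths, ports, and image tags here are illustrative), both versions can run side by side on one host:

```shell
# Two containers, two Node.js versions, one host -- no extra VM needed
docker run -d --name app-legacy -p 8080:3000 \
    -v "$PWD/legacy:/app" -w /app node:14 node server.js
docker run -d --name app-modern -p 8081:3000 \
    -v "$PWD/modern:/app" -w /app node:20 node server.js
```

Each container carries its own runtime, so the two versions never conflict.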
4. Distribution & Collaboration
If you would like to share images and containers,
Docker supports this “social” feature: anyone can
contribute to a public (or private) image.
Individuals and communities can collaborate on and
share images, and users can star the images they
find useful. On Docker Hub, you can find both
trusted (official) images and community images.
Some images offer automated builds and security
scanning to keep them up to date.
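For illustration, sharing an image through Docker Hub looks like this (myuser is a placeholder account name):

```shell
# Pull a trusted (official) image, retag it under your own namespace, and share it
docker pull nginx:latest
docker tag nginx:latest myuser/custom-nginx:1.0
docker login                        # authenticate against Docker Hub
docker push myuser/custom-nginx:1.0
```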
5. Multi-Tenancy & High Availability
Using the right tools from the ecosystem, it is easier to
run many instances of the same application on the same
server with Docker than with the mainstream approach.
Using a proxy, a service discovery tool, and a scheduler,
you can start a second server (or more) and load-balance
your traffic across the cluster nodes where your
containers are “living”.
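As one example of such tooling, Docker’s built-in Swarm mode combines a scheduler with a routing mesh that load-balances across nodes (a sketch; the join token and IP address are placeholders):

```shell
docker swarm init                                    # on the first server
docker swarm join --token <worker-token> <ip>:2377   # on each additional server
docker service create --name web --replicas 4 -p 80:80 nginx
docker service scale web=8                           # scale out as traffic grows
```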
6. CI/CD — Build Once, Run Everywhere
Docker is used in production systems, but it is also a
way to run the same application on a developer’s
laptop or server. An image can move from development to
QA to production without being changed, so if you would
like to stay as close as possible to production, Docker
is a good solution.
Since it solves the “it works on my machine” problem,
this use case is worth highlighting: many problems in
software development and operations come from the
differences between development and production
environments.
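As a sketch of this “build once, run everywhere” flow (the registry address, image name, and environment variable are placeholders):

```shell
# Build and publish the image exactly once
docker build -t registry.example.com/myapp:1.4.2 .
docker push registry.example.com/myapp:1.4.2

# QA and production pull the very same artifact; only configuration differs
docker run -d -e APP_ENV=qa registry.example.com/myapp:1.4.2
docker run -d -e APP_ENV=production registry.example.com/myapp:1.4.2
```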
7. Isolation & The Dependency Hell
Dockerizing an application also isolates it in a
separate environment.
As with the first point, imagine you have to run
two APIs written in two different languages, or
written in the same language but with different
versions. You may need two incompatible versions
of the same language, with each *API* running on
one of them, for example Python 2 and Python 3.
If the two apps are dockerized, you don’t need to
install anything on your host machine except Docker;
every version runs in its own isolated environment.
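A minimal sketch of the Python 2 / Python 3 case, assuming each API’s code sits in its own directory (the paths and image tags are illustrative):

```shell
# Each API gets its own interpreter; nothing but Docker lives on the host
docker run -d --name api-py2 -v "$PWD/api2:/app" -w /app python:2.7 python app.py
docker run -d --name api-py3 -v "$PWD/api3:/app" -w /app python:3.12 python app.py
```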
8. Using The Ecosystem
You can use Docker with many external tools:
configuration management tools, orchestration tools,
file storage technologies, filesystem types, logging
software, monitoring tools, self-healing tools, etc.
On the other hand, even with all the benefits of Docker,
it is not always the best solution to use; there are
always exceptions.
9. Development in a cross-platform
environment is easier
If you are like me, you love your Mac running OS X.
The problem is, most of the code that I deploy runs on
Linux-based machines: I develop code to run in the
cloud, or code to run on embedded systems like a
Raspberry Pi.
However, I can’t give up OSX as my daily use operating
system and development environment. Linux just
doesn’t cut it for me as a general use operating system.
In my opinion, it’s ugly and just doesn’t have all the
applications I need to get through my day.
10. Deployment of Applications is easier
If you have ever tried to deploy a software application
with a large number of dependencies, you know it can
be a very painful process. On a Linux-based system, it
often involves a long series of apt-get installs for
different packages just to be able to run your
code.
With Docker, all of those dependencies are built into the
image itself. This means that when you want to deploy
your application built into a Docker image, all you need
installed on the system itself is Docker plus the image.
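As a sketch, a Dockerfile bakes those apt-get dependencies into the image at build time (the base image and package names here are illustrative):

```Dockerfile
FROM debian:bookworm-slim
# Install dependencies once, at build time, instead of on every target host
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip libpq-dev \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY . .
CMD ["python3", "app.py"]
```

Once the image is built and pushed, any host with Docker can run the application with a single docker run, with no apt-get needed on the host itself.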