In this article, we will familiarize you with virtualization, containerization, and Docker. What is this software, and why do we use it? What are the differences between virtual machines and Docker containers? All this and more in this article; we encourage you to read on.


Basic concepts

To understand what virtualization is and how Docker software works, it is necessary to understand the basic terms:

  • Container - a standardized unit of software. It contains a package of code along with the dependencies required to run it, for example JavaScript code together with the Node.js interpreter.
  • Docker - a tool that allows us to place our program along with its dependencies (libraries, configuration files) in a lightweight, portable, virtual container that can be easily run on a server.

Thanks to containers, we can run our software on any system on which Docker is installed.


Purpose behind using Docker

If you have any experience working as a programmer in a team, you have probably experienced technical problems related to software development. Docker won't help us with all the coding difficulties, but it can eliminate the most frustrating obstacles, such as:

  • Different production and development environments - if we use Python 3.9 and its new features when coding our application locally, but the production server has a Python 3.3 interpreter, our application may not work until we upgrade the interpreter version on the server. It is easier to deliver a container to the server that includes the appropriate version on which our program should run. It is good practice to test and build on the same versions in both the production and development environments.
  • Different environments between team members - team members may have different versions of dependencies, such as different versions of Python or of libraries. One might think that updating to the appropriate version is not a big deal, but:
    • the lack of certain standardization means that there is no guarantee that our code will work on someone else's machine every time;
    • updating can be complicated with large dependencies.

By enclosing the code in a container, we standardize the environment in which it runs, so we can be confident it will behave the same wherever Docker is available.

  • Different versions of tools/libraries between projects - each team member may work on many projects, each of which may require tools or libraries in different versions. Switching between versions may mean reinstalling them, which Docker avoids by installing each project's dependencies inside its own container.

Simply put, Docker standardizes the way we run applications and ensures that if the application runs on one environment, it will run the same on another - provided, of course, that Docker software is installed there.
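The first point above can be sketched with a minimal Dockerfile that pins the interpreter version, so development and production always match. The file names and start command here are illustrative assumptions, not a specific project:

```dockerfile
# Pin the interpreter version so development and production match exactly
FROM python:3.9-slim

# Work inside a dedicated directory in the container
WORKDIR /app

# Install dependencies first, so Docker can cache this layer between builds
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application source and define how to start it
COPY . .
CMD ["python", "main.py"]
```

Anyone on the team who builds this file gets Python 3.9 inside the container, regardless of which Python version (if any) is installed on their own machine.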


Virtual Machines vs Containerization

Virtualization is the simulation of the existence of logical resources through software that uses physical resources. Virtual machines can run an operating system inside another system. The host system then assigns certain resources to such a virtual machine. The key difference between containers and virtual machines is that the latter virtualize the entire machine, including the operating system, sometimes referred to as the Guest OS. Containers virtualize only the software layers above the operating-system level and share the host's kernel; they are often used in microservices architectures, for example.

Picture 1: Comparison of virtual machines and containers. Source: Atlassian.


Docker installation

The installation of the software depends on the operating system, which is why we have decided to present the most popular option, Windows. You should:

  1. Enable virtualization in the BIOS and activate the Hyper-V feature in Windows.
  2. Then follow the installation guide on the official Docker website.
  3. On Windows, you should also install WSL to optimize Docker's performance.

After the installation, you should be able to execute the "docker" command in any chosen terminal (PowerShell, cmd).
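A quick sanity check might look like the following. These commands assume the Docker daemon is running on your machine; `hello-world` is the official tiny test image from Docker Hub:

```shell
# Verify the client is installed and reachable from the terminal
docker --version

# Verify the whole pipeline: pull a tiny test image and run it in a container
docker run hello-world
```

If the second command prints a greeting message, the installation works end to end: the client contacted the daemon, the image was downloaded, and a container was created and run.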


Images vs containers

Beginners in containerization often confuse these two terms, and understanding them is crucial for working effectively with Docker:

  • Image - a file used to execute code in a Docker container. Images function like a set of instructions for creating a container, similar to a template. An image is comparable to a snapshot in virtual machine technologies.
  • Container - a runnable instance of an image: a lightweight, standalone, executable package that contains everything needed to run an application: code, runtime, system tools, system libraries, and settings.

An image is therefore a file from which we produce instances of our applications. It's a bit like the relationship between a class in programming and an object that is an instance of that class. If you're not a programmer, you can think of an image as a recipe and a container as a finished dish (you can make the same dish multiple times from one recipe).
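The recipe/dish analogy can be seen directly on the command line. This sketch assumes the public `nginx` image from Docker Hub; the container names are made up for the example:

```shell
# One image ("recipe") ...
docker pull nginx:1.25

# ... many containers ("dishes") - each one independent of the others
docker run -d --name web1 nginx:1.25
docker run -d --name web2 nginx:1.25

# Both running containers were created from the same image
docker ps
```

Stopping or deleting `web1` has no effect on `web2`, just as eating one dish does not affect another made from the same recipe.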



A Dockerfile is a text file that contains the instructions used to build an image. In other words, it defines which interpreter and libraries our application should use when it runs.

Picture 2: Example Dockerfile for running a React.js application.
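For readers of a text-only version, a minimal Dockerfile for a React.js application along the lines of Picture 2 might look like this. The Node.js version and exposed port are assumptions; a typical Create React App project uses port 3000:

```dockerfile
# Base image: an official Node.js runtime (version chosen as an example)
FROM node:18-alpine

WORKDIR /app

# Copy the dependency manifests first so "npm install" is cached between builds
COPY package.json package-lock.json ./
RUN npm install

# Copy the application source into the image
COPY . .

# The React development server listens on port 3000 by default
EXPOSE 3000
CMD ["npm", "start"]
```

In a real production setup you would usually build the static files and serve them with a web server instead of running the development server, but the sketch above is enough to illustrate what a Dockerfile contains.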


From the defined Dockerfile, an image can be generated, and from the image, a container can be run. If you feel lost among the new concepts, the following diagram summarizes the workflow.

Picture 3: Workflow that accompanies the creation of a container.
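The Dockerfile → image → container workflow boils down to a couple of commands. The image and container names here are hypothetical:

```shell
# 1. Build an image from the Dockerfile in the current directory
docker build -t my-app:1.0 .

# 2. The new image appears on the local image list
docker images

# 3. Start a container from the image (detached, with a readable name)
docker run -d --name my-app-instance my-app:1.0

# 4. The running container appears on the container list
docker ps
```

Running `docker run` again with a different `--name` would start a second, independent container from the very same image.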



We hope that we have presented the reasons why it is worth learning Docker and how it can help in daily work. In the next article, we will dive into how this software works and teach you the basics of using it.