I'm trying to get a better understanding of the reasons to use [and not use] Docker based on specific use cases.
From my current understanding, Docker isolates applications and their dependencies in containers, which is useful for consistent, reproducible builds across varied environments.
However, I'm struggling to understand the rationale of using Docker where the environments are essentially the same, and the applications are relatively simple.
Say I have the following:
- a cloud VM instance (DigitalOcean, Vultr, Linode, etc.) with 1 GB of RAM, running Ubuntu 20.04
- a Node.js Express app (nothing too complicated)
The following issues come to the fore:
Dockerizing this application produces an image of roughly 100 MB after optimization (without optimization, probably 500 MB or more, based on my research). The app itself might be 50 KB, yet the image needed to run it is larger by a factor of up to 10,000 or above. That seems very unreasonable from an optimization standpoint.
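For reference, the "optimized" image I'm describing would come from something like the Dockerfile below. This is only a sketch; the base image tag, port, and file names are placeholders, not my actual project:

```dockerfile
# Sketch of an optimized image for a small Express app.
# node:20-alpine is an assumption; the Alpine-based Node images
# are among the smaller official options.
FROM node:20-alpine

WORKDIR /app

# Install only production dependencies to keep the image small
COPY package*.json ./
RUN npm ci --omit=dev

# Copy the (tiny) application source last, so dependency layers cache
COPY . .

EXPOSE 3000
CMD ["node", "index.js"]
```

Even with all of that, the Node runtime and base OS layers dominate the image size, which is the disproportion I'm asking about.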
I have to push this container image to a registry before I can pull it onto my VM. That's 500 MB up to the hub, then 500 MB down to the VM: about 1 GB of bandwidth per build. Multiply that by the number of times the build needs to be updated, and bandwidth usage could approach terabytes.
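To make the bandwidth worry concrete, here's the back-of-the-envelope arithmetic I'm doing, using my unoptimized 500 MB estimate from above (I know layer caching can reduce re-transfers, so treat this as the worst case):

```shell
# Worst-case bandwidth estimate, in whole MB.
image_mb=500                        # unoptimized image size (my estimate)
per_build_mb=$(( image_mb * 2 ))    # one push to the hub + one pull to the VM
echo "per build: ${per_build_mb} MB"

# Number of builds before total transfer reaches ~1 TB (1,000,000 MB)
builds_to_tb=$(( 1000000 / per_build_mb ))
echo "builds to reach 1 TB: ${builds_to_tb}"
```

So on these numbers it takes about 1,000 builds to move a terabyte, which a busy project could plausibly hit.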
I read in a DigitalOcean tutorial (https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-22-04) that before I can run my container image, I have to do the following:
```
docker pull ubuntu
```
This pulls an Ubuntu image. But I'm already on Ubuntu, so does that mean I'd be running a container running Ubuntu inside a VM that's already running Ubuntu? That looks like needless duplication, but I'd appreciate clarification.
The Docker Desktop installation instructions for Linux (https://docs.docker.com/desktop/install/linux-install/) specify at least 4 GB of RAM. That means I'd have to pay for a more expensive VM instance even when my application doesn't necessarily require it.
How exactly does containerization [using Docker or similar] optimize and enhance the DevOps experience, especially on an ongoing basis?
I'm not quite getting it, but I'm open to clarification.