I finally understand Docker and virtual machines feel irrelevant now

Virtual machines are extremely easy to set up, use, and replicate. Over the years, I've always had a virtual machine handy for various key reasons, and they've never failed me. So much so that I didn't even consider using Docker for years.
That said, virtual machines aren't perfect, and that led me to try Docker again. It finally clicked, and understanding Docker has changed my perspective on development environments.
The virtual machine trap
I used to think VMs were the only way to isolate apps
As mentioned before, virtual machines are convenient to work with. They've been the go-to solution for everything from testing a different OS to needing to isolate a project for testing. Anytime I needed a clean or familiar environment, I would spin up a VM and get to work.

However, VMs by their nature come with massive overhead. You're running an entirely different OS on top of your existing OS any time you run a VM. On my 16GB laptop, I could only run about two VMs simultaneously before I started running out of memory.
That would be fine if I were testing an OS—a task that requires loading the entire thing. However, for individual apps and modules, you don't actually need the full OS to load. As long as you have the required dependencies, you can run just about anything you want with a much smaller package size.
Then there's the issue of speed. A VM can take a couple of minutes to boot every single time. That's not too much of an issue for someone who periodically fires up a VM to test something out. However, when you're in an active development environment where machines need to be frequently restarted to test apps, stability, or just to get rid of errors, those minutes start adding up.
Last but not least, VMs can easily take up a couple of GBs on your storage drive, even if you haven't installed an OS. Most VMs I created had dynamically allocated storage drives, which meant I eventually had hundreds of GBs worth of storage locked away for VMs I wouldn't even use all that much.
How Docker finally clicked for me
Containers as lightweight snapshots instead of full systems
The biggest difference between Docker and VMs is that Docker containers aren't mini VMs: they're closer to smart packaging for your applications.
These containers share their kernel with the host OS. This means they don't need their own OS, just the app and its specific dependencies. However, if you need to, you can easily run a full OS like Ubuntu as a Docker container.
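As a quick illustration (assuming Docker is installed and running), an Ubuntu userland can be pulled and entered with a single command. Note that nothing boots here: the container gets Ubuntu's filesystem but keeps sharing the host's kernel.

```shell
# Pull the Ubuntu image and open an interactive shell inside a container.
# No hypervisor, no boot sequence: the container reuses the host kernel.
docker run -it --rm ubuntu:24.04 bash

# Inside the container, the filesystem says Ubuntu...
cat /etc/os-release

# ...but the kernel version is still the host's:
uname -r
```

The `--rm` flag removes the container when you exit, which keeps experiments from piling up on disk.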

All this gets tied in a neat package, which comes in at several MBs instead of GBs. This makes Docker containers extremely lightweight and resource-efficient.
Compared to the minutes it would take a VM to boot, Docker containers start almost instantly. When I'm developing and need to restart something to test changes, I no longer have to sit twiddling my thumbs while my developer environment boots up. The containers restart almost instantly and are ready to go within seconds.
Another advantage of using Docker is that when you containerize an application, you're packaging everything it needs to run, including the exact versions of libraries, dependencies, environment variables, and even the specific OS layers it expects. When someone else runs your container, they're getting the same environment you developed in.
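A minimal Dockerfile makes this concrete: the base image, dependency versions, and environment variables are all pinned in one file, so anyone who builds it gets the same environment. This is an illustrative sketch for a hypothetical Python web app, not a recipe from a real project.

```dockerfile
# Illustrative container recipe for a small Python web app.
FROM python:3.12-slim        # exact interpreter version, pinned

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt   # exact library versions

COPY . .
ENV APP_ENV=production       # environment baked into the image
CMD ["python", "app.py"]
```

Building this with `docker build -t myapp .` produces an image that runs identically on any machine with Docker, because everything the app expects ships inside it.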
This saves a ton of time that would otherwise be wasted dealing with version conflicts, missing dependencies, or subtle differences between development and production environments. Docker renders the "it works on my machine" excuse obsolete, and I'm all for it. That said, there are several Docker best practices you should follow for the best results.
All these benefits also lead to a much smaller overhead when running containers. You can run hundreds of Docker containers on a machine that would only handle a couple of VMs. Whether you're running your containers locally or on the cloud, this directly saves you money in terms of hardware.
There's a reason why Docker is becoming an increasingly sought-after skill for backend software engineers. Thankfully, it's easy enough to start with, especially if you nail the essential Docker commands for beginners. There are also some essential Docker topics for beginners you should familiarize yourself with if you're just getting started.
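For reference, the handful of commands below cover most day-one usage. This is a sketch of the standard Docker CLI, not an exhaustive list; `nginx` is just a convenient public image to practice with.

```shell
docker pull nginx                 # download an image from a registry
docker run -d -p 8080:80 nginx    # start a container in the background, mapping port 8080 to 80
docker ps                         # list running containers
docker logs <container-id>        # inspect a container's output
docker stop <container-id>        # stop a running container
docker images                     # list images stored on disk
docker system prune               # reclaim space from stopped containers and unused images
```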
Microservices make sense now
Docker turned the buzzword into something practical
Before understanding Docker, the microservice architecture sounded like a nightmare. I'm not a fan of splitting up my code into multiple files or services. However, I've also never had to scale an app to support a massive number of users, so regular VMs and servers meet my needs just fine.
Splitting up a nice and simple monolithic app into smaller services can still be a nightmare to manage if it's a personal project where you don't really care about efficiency. However, since each microservice becomes a self-contained unit in Docker, you can keep your app running even if a service crashes.
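To make "self-contained unit" concrete, here's a hedged sketch of a Docker Compose file splitting a hypothetical shop app into independent services (the service names and image names are illustrative). If `payments` crashes, `web` and `catalog` keep serving, and the restart policy brings the failed container back on its own.

```yaml
# docker-compose.yml (illustrative): each service runs in its own container
services:
  web:
    image: shop/web:1.0
    ports:
      - "80:8000"
  catalog:
    image: shop/catalog:1.0
    restart: unless-stopped
  payments:
    image: shop/payments:1.0
    restart: unless-stopped    # a crash here restarts only this container
```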

This means that if you're developing an app for the masses, you won't be facing app crashes the second something in your code breaks. Additionally, scaling up individual parts of your app becomes incredibly easy and cheap.
For example, if you anticipate a heavy flow of traffic for an e-commerce website due to Black Friday sales, you can scale up your payment processing service. This saves time and money by scaling just the service that's going to face the brunt of the traffic, keeping the rest of your infrastructure optimized and running. Even if a particular service crashes, your app is still accessible.
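With a Compose-managed app, scaling one service is a single command; the other services are untouched. The service name `payments` here is illustrative, standing in for whichever service takes the traffic spike.

```shell
# Run five replicas of the payments service for the Black Friday rush.
docker compose up -d --scale payments=5

# Scale back down once traffic subsides.
docker compose up -d --scale payments=1
```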
Many big tech products run thousands of microservices this way. It's complex, yes, but it's not complexity for complexity's sake. It becomes simpler to manage when each piece is properly containerized.
VMs aren't obsolete yet
They're still great for certain workloads
Docker containers can't put VMs out of business just yet. There are plenty of cases where you'd prefer a VM over a container, especially if you require full OS isolation for security or testing reasons. Additionally, if you need to run legacy applications that require specific OS versions or configurations, VMs are the way to go.
However, for modern app development, Docker can prove to be a game-changer. It's quite easy to install Docker on Windows 11, and you should give it a shot if you haven't already.
It's faster, more efficient, more portable, and most importantly, cheaper to run compared to VMs. The future is clearly containerized, and I finally understand why everyone's been so excited about it all this time.