I'm trying to learn about virtualization and containers. An educational video I watched says that the introduction of virtual machines was a big deal because it let companies run multiple server instances on the same physical machine, so they didn't necessarily have to buy a new machine just to run a new instance.
But what if I have a Node.js server application? I could run multiple instances of that on the same machine without any kind of virtualization. Why is running multiple instances even necessary, and why were virtual machines needed for it? My best guess is that Node.js is designed so that multiple instances can coexist (the instances don't "know" about each other), so virtualization isn't needed there, whereas the server software of that era would have interfered with each other ("known" about each other) unless they were separated into virtual servers.
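For concreteness, here is a minimal sketch of what I mean (the file name and ports are just made up for illustration): two copies of the same Node.js server can share one machine as long as each listens on its own port, and they otherwise don't interact at all.

```js
// server.js — hypothetical minimal server; the only per-instance setting is the port.
const http = require('http');

// Each instance reads its own port from the environment, defaulting to 3000.
const port = Number(process.env.PORT) || 3000;

http.createServer((req, res) => {
  res.end(`Hello from the instance listening on port ${port}\n`);
}).listen(port, () => {
  console.log(`Listening on ${port}`);
});

// Run two instances side by side, no virtualization involved:
//   PORT=3000 node server.js
//   PORT=3001 node server.js
```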
Do people do the same thing with containers? If I have a microservices architecture and, say, only one computer, am I using containers only to check that the services run correctly in different environments, or would I also run multiple containers of even the same microservice?
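To make the second half of that question concrete, here is a sketch of what I imagine that would look like (the service and image names are invented for the example): a docker-compose file for one microservice, scaled to several identical containers on a single machine.

```yaml
# docker-compose.yml — hypothetical setup; "api" / "my-node-api" are placeholder names.
services:
  api:
    image: my-node-api
    # No fixed host port mapping here: several replicas of the same image
    # can't all bind the same host port, so a reverse proxy or load
    # balancer would normally sit in front of them.

# Three identical containers of the same microservice on one computer:
#   docker compose up --scale api=3
```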
Please excuse my sloppy wording; I'm mainly trying to figure out what it is that I'm actually missing or misunderstanding, and to convey my abstract, high-level understanding of the issue.
Link to video: https://youtu.be/JSLpG_spOBM?t=77