This is my second post in the series about the costs of server virtualization. So, what about the “hay argument”: ten horses require ten times more hay than one horse. Do ten servers always consume more energy than one? I don’t think so. How much hay or power is consumed does not depend solely on the number of workers performing a certain task.
I suppose a well-fed sumo wrestler consumes ten times as much food as a ballerina. However, ten ballerinas stand a fair chance of flooring a sumo wrestler. Power consumption is tied to the amount of work that has to be done. Thus, if a high-end server has to do the work of ten average servers, it obviously needs much more power than each of them does.
Of course, most supporters of the power-savings claim are aware of this fact. Their point is that servers also consume considerable power when they are idle. Since a high-end server running ten virtual servers is seldom idle, it uses energy more efficiently. This argument is certainly valid if you assume that the ten average servers are oversized for their tasks and therefore often consume energy without doing valuable work.
However, there are other factors to consider here. Much of the electrical power a server consumes is converted into heat that has to be removed. The problem is that dissipating heat from an object with a small surface area is more difficult than from a large one. This is why the fans in computers have become bigger and bigger over the years. I remember quite well that my first computer didn’t even have a fan. The point here is that fans require energy, too. The more CPU power a computer of a given size has, the more energy you need to get rid of the heat. Obviously, a high-end server has less volume than ten average servers do, which means that you need more energy to cool this one server than to cool multiple servers with a larger combined surface area.
Even though the high-end server is well equipped with fans, its CPUs, power supplies, and hard disks will always operate at a higher temperature than those in the ten average servers. This is where the second argument comes in: a system running at a higher temperature wastes more energy as heat. Imagine a sprinter who has to run one hundred meters in ten seconds. Even if he is well trained, he will be sweating a lot. Now imagine ten average people who each have to walk ten meters in ten seconds. Together they also cover one hundred meters, but they won’t sweat. They won’t feel as exhausted as the sprinter, even if you repeat the experiment ten times. Our sprinter, however, would most likely be dead afterwards.
I know this metaphor is a bit far-fetched, but I think you get my point: you can’t always save energy simply by reducing the number of workers. Many factors have to be considered here. If you were to replace ten old servers with one new high-end server, you would probably measure reduced power consumption afterwards. But perhaps five new average servers without virtualization technology would have been able to do the job as well. Would those five average workers have needed less energy than your sprinter?
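To make this kind of comparison concrete, here is a minimal back-of-envelope sketch in Python. All power figures and cooling-overhead factors are made up purely for illustration; plug in measurements from your own servers, and the ranking between the high-end host and the five new average servers can easily flip either way.

```python
# Back-of-envelope energy comparison (all figures below are hypothetical).
HOURS_PER_YEAR = 24 * 365

def annual_kwh(avg_watts: float, cooling_overhead: float) -> float:
    """Annual energy in kWh, including a rough factor for fan/cooling overhead."""
    return avg_watts * (1 + cooling_overhead) * HOURS_PER_YEAR / 1000

# Ten old average servers, mostly idle (assumed 180 W each, 30 % cooling overhead).
ten_old_average = 10 * annual_kwh(180, 0.3)

# One high-end server running ten VMs, heavily loaded and harder to cool
# (assumed 900 W, 50 % cooling overhead).
one_high_end = annual_kwh(900, 0.5)

# Five new average servers, moderately loaded (assumed 200 W each, 30 % overhead).
five_new_average = 5 * annual_kwh(200, 0.3)

for name, kwh in [("10 old average servers", ten_old_average),
                  ("1 high-end server, 10 VMs", one_high_end),
                  ("5 new average servers", five_new_average)]:
    print(f"{name:28s} {kwh:8.0f} kWh/year")
```

With these made-up numbers, the consolidated host and the five new servers end up close together, both well below the old fleet. How much of the saving comes from virtualization and how much from simply buying newer hardware depends entirely on the figures you put in.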
Whether, and by how much, you can reduce power costs with virtualization technology depends heavily on your environment. I doubt that a general formula exists that can calculate this for you. Personally, I do think that one can save power in some cases by virtualizing servers. However, it seems to me that the benefits are often exaggerated by vendors trying to sell high-priced high-end servers or virtualization software.
I searched Google for a scientific study that extends the analysis beyond the typical “idle time argument”, but I was not able to find one. This made me even more suspicious. If you are aware of such a study from an independent scientific institution, I would be very interested.
When it comes to the costs of server virtualization, the power savings are outweighed by other factors anyway. This will be the topic of my next post in this series.
Articles in this series:
- Seven disadvantages of server virtualization
- Does server virtualization reduce costs? Part I – Hardware expenses
- Does server virtualization reduce costs? Part II – Power savings
- Does server virtualization reduce costs? Part III – Software and payroll costs