Vern Burke, SwiftWater Telecom
I just read a piece discussing the debate over whether cloud computing actually saves energy. It's a good example of how silly things can get when you try to quantify the unquantifiable.
The argument is really very simple. The idea is that the energy cost of the networking equipment required to move data back and forth to a cloud computing provider offsets the energy saved when an end customer dumps an inefficient local data center in favor of cloud computing. If the original claim of cloud computing energy savings is oversimplified, so is this argument against it.
First, any claim about the energy cost of the network would require a detailed analysis of how much energy it takes to move a bit of data from point A to point B. Good luck accounting for every single piece of power-consuming electronic equipment in any one path through the Internet, let alone every POSSIBLE path the data could take. Remember, you not only have to account for the networking equipment itself but also for the communications carrier's facilities that provide the link.
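To see just how many guesses that analysis would take, here's a minimal per-hop sketch in Python. Every device, wattage, line rate, and the facility overhead factor below is a hypothetical assumption, not measured data, and that's exactly the problem with the argument:

```python
# Toy per-hop model of network transfer energy -- a minimal sketch with
# made-up numbers, not a measurement.

# One assumed path: (device, assumed watts, assumed capacity in Gbit/s).
# A real packet may take any of thousands of paths, and each device serves
# many flows at once, so attributing its power is already a judgment call.
path = [
    ("customer router",        20,    1),
    ("metro Ethernet switch",  150,   40),
    ("carrier edge router",    2000,  400),
    ("core router",            8000,  1600),
    ("cloud border router",    2000,  400),
    ("cloud top-of-rack",      150,   40),
]

FACILITY_OVERHEAD = 2.0  # assume 1 W of cooling/power loss per 1 W of gear

def joules_per_gigabyte(path):
    """Energy attributed to pushing 1 GB through each hop at full line rate."""
    total = 0.0
    for _name, watts, gbps in path:
        seconds_per_gb = 8 / gbps  # 1 gigabyte = 8 gigabits
        total += watts * seconds_per_gb * FACILITY_OVERHEAD
    return total

j = joules_per_gigabyte(path)
print(f"~{j:.0f} J/GB, or ~{j / 3600:.2f} Wh/GB, for ONE assumed path")
```

Change any one of those guesses and the answer moves, which is why nobody can honestly put a hard number on "the energy cost of the network."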
Second, it's easier to make this claim when the local data center only supports local users. Add remote users or a public-facing website and now you're generating cross-Internet traffic from nearly anywhere. Take the first issue and multiply it thousands of times.
In reality, the energy cost of moving data is pretty low compared to the energy wasted by the 95% underutilization rate commonly found in small local data centers. Wasting that much energy leaves a LOT of room to absorb the cost of shipping bits across the network.
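A back-of-envelope comparison makes the point. The numbers below (server draw, monthly traffic, per-gigabyte transfer energy) are rough assumptions for illustration, not measurements, but the gap is wide enough that reasonable changes to them won't flip the conclusion:

```python
# Back-of-envelope: idle-capacity waste of a local server vs. the energy
# to move its data across the network. All figures are rough assumptions
# for illustration; published per-GB estimates vary widely.

SERVER_WATTS       = 300    # assumed draw of a typical small server
UTILIZATION        = 0.05   # the 95%-underutilized case
HOURS_PER_MONTH    = 730
NETWORK_WH_PER_GB  = 0.2    # assumed transfer energy (see sketch above)
MONTHLY_TRAFFIC_GB = 500    # assumed data moved to and from the cloud

# Attributing 95% of the draw to unused capacity is a simplification, but
# servers pull close to full power even when idle, so it's in the ballpark.
wasted_wh  = SERVER_WATTS * (1 - UTILIZATION) * HOURS_PER_MONTH
network_wh = NETWORK_WH_PER_GB * MONTHLY_TRAFFIC_GB

print(f"Idle-capacity waste: ~{wasted_wh / 1000:.0f} kWh/month")
print(f"Network transfer:    ~{network_wh / 1000:.2f} kWh/month")
print(f"Ratio:               ~{wasted_wh / network_wh:.0f}x")
```

With these assumptions the idle waste comes out around three orders of magnitude bigger than the transfer energy. Even if my per-GB guess is off by 10x in either direction, the network cost stays down in the noise.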
Add to this the massive economies of scale involved in cloud computing, and I'd be hard-pressed to imagine a situation where the energy cost of the network outweighed the energy savings of cloud computing.