Vernon Burke, SwiftWater Telecom
Recently, Google released a number of details about their formerly closely guarded data center techniques. In this article I take a look at the “per server” battery backup model. This model involves DC power distribution with individual backup batteries in every server.
The first item to look at is power efficiency. The claim is that placing the battery in the server reduces power loss and approaches 100% efficiency. On quick examination, we see that the power has to travel the same overall distance from the supply to the server regardless of where the battery is located in the circuit. So, what “efficiency” does this increase?
Placing the battery in the server does increase the runtime of the server for a given battery capacity, since the finite amount of energy stored in the battery is not being chewed up by losses in the cabling between the battery and the server. But considering the tiny amount of time these servers will spend running from the backup battery, any efficiency gain is minuscule to start with. Consider also that the battery must be recharged from the power plant as well. It doesn’t matter whether you add more loss to the charge side of the battery and less to the load side; the overall loss, and therefore the overall efficiency, remains the same.
What Google does end up doing is shifting the loss from the battery (which can least afford it) to the main supply (which can most afford it). Not a bad move but not a miracle energy efficiency step.
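To make the arithmetic concrete, here is a minimal sketch with made-up loss figures (the percentages below are illustrative assumptions, not Google's numbers). Because the same multiplicative loss factors apply end to end, moving the cable run from the discharge side of the battery to the charge side does not change the overall grid-to-server efficiency:

```python
# Illustrative (assumed) efficiencies for each stage of the power path.
supply_eff = 0.95    # main power supply / rectification
cable_eff = 0.97     # distribution cabling between supply and server
battery_eff = 0.90   # round-trip efficiency of the battery itself

# Case A: battery in the server -- the cable loss sits on the
# charge side of the battery (grid -> cable -> battery -> server).
eff_battery_in_server = supply_eff * cable_eff * battery_eff

# Case B: centralized battery -- the cable loss sits on the
# load side of the battery (grid -> battery -> cable -> server).
eff_central_battery = supply_eff * battery_eff * cable_eff

# The factors are the same either way, so the overall efficiency
# of energy that passes through the battery is unchanged.
print(eff_battery_in_server, eff_central_battery)
```

What does change, as noted above, is *where* the loss is paid: in case A the battery's stored energy is delivered to the server without further cable loss, which is a runtime win, not an overall efficiency win.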
Now, add on the cost of tens (hundreds?) of thousands of batteries, the maintenance of those batteries, and the ecological impact of disposing of them as they fail, and this becomes a really questionable proposition.
Finally, consider that this scheme only works for Google because they run so many redundant servers that it doesn’t really matter if a significant percentage of them fail on power changeover because their batteries are bad. For any normal data center provider, attempting this method of operation would be catastrophic.
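A rough back-of-the-envelope sketch shows why fleet size matters here. The failure rate and fleet sizes below are invented purely for illustration:

```python
# Assumed probability that an individual server's battery is dead
# at the moment of a power changeover (illustrative figure only).
p_bad_battery = 0.02

# Hypothetical fleet sizes.
fleet_hyperscale = 100_000  # massively redundant fleet
fleet_typical = 200         # typical provider, little redundancy

# Expected number of servers that drop on every utility glitch.
expected_down_hyperscale = fleet_hyperscale * p_bad_battery
expected_down_typical = fleet_typical * p_bad_battery

# A redundant fleet simply routes around a few thousand lost nodes;
# a provider whose 200 servers each host a distinct customer eats
# several simultaneous customer outages on every changeover.
print(expected_down_hyperscale, expected_down_typical)
```

The absolute numbers are fiction, but the asymmetry is the point: the same per-battery failure rate is an operational shrug for one operator and an SLA disaster for the other.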
In summary, unless the laws of physics have somehow been repealed, the source of the claimed efficiency increase is not apparent, nor does the technique appear to be applicable to the vast majority of data center providers.