Tag Archives: microsoft

Finally, proof positive that PUE is garbage.

Vern Burke, SwiftWater Telecom
Biddeford, ME

I’ve just been reading a piece about Microsoft removing the fans from its data center servers and that change having a negative effect on its PUE numbers. I’ve written on this blog before about the problems with PUE; now we have proof that it needs to be put out of its misery.

In a nutshell, PUE is the ratio of the total power consumed by the data center to the power consumed by its IT equipment. A PUE of 1.0 would indicate a data center where all the power is being consumed by the IT equipment. A PUE greater than 1.0 indicates a data center where a certain amount of power is being consumed by equipment other than IT gear, the biggest chunk of which is cooling.
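To make the arithmetic concrete, here's a quick sketch of the ratio (the kilowatt figures are invented purely for illustration):

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """PUE = total facility power / IT equipment power, so it is >= 1.0 in practice."""
    return total_facility_kw / it_kw

# Hypothetical data center: 1000 kW of IT load plus 500 kW of cooling
# and other overhead, for 1500 kW total facility draw.
print(pue(total_facility_kw=1500, it_kw=1000))  # 1.5
```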

The problem I’ve written about before with PUE is that it fails to take into account the actual work being accomplished by the IT equipment in the data center. Throw in a pile of extra servers that are simply turned on and idling, not doing anything useful, and you’ve just gamed your PUE into looking better.
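Here's what that gaming looks like with some made-up numbers (all the kilowatt figures below are hypothetical):

```python
def pue(total_kw: float, it_kw: float) -> float:
    return total_kw / it_kw

it_kw = 1000        # servers doing useful work
overhead_kw = 500   # cooling, lighting, distribution losses

before = pue(it_kw + overhead_kw, it_kw)  # 1500 / 1000 = 1.5

# "Game" the metric: rack up 200 kW of idle servers doing no useful work.
idle_kw = 200
after = pue(it_kw + idle_kw + overhead_kw, it_kw + idle_kw)  # 1700 / 1200 ≈ 1.42

# PUE "improves" even though not one extra unit of work gets done.
print(before, after)
```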

The problem shown here is even more damning. Microsoft determined that data center energy consumption could be reduced by removing the individual cooling fans from its servers and increasing the size of the data center cooling system. Since the increase in power for the data center cooling systems is less than the power required for the individual server fans, the data center accomplishes the same amount of work for less total energy consumption, an efficiency win in anyone’s book.

The side effect of this is that, even though the total energy consumption for the data center is reduced, transferring the energy usage from the fans (part of the IT equipment number) to the cooling (part of the non-IT equipment number) makes the PUE for the data center look WORSE.
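The same toy arithmetic shows the perversity (again, the kilowatt figures are invented for illustration; Microsoft hasn't published numbers this simple):

```python
# Before: 50 kW of server fans are counted as part of the IT load.
it_kw, cooling_kw = 1000, 400        # IT figure includes the 50 kW of fans
before_total = it_kw + cooling_kw    # 1400 kW
before_pue = before_total / it_kw    # 1.40

# After: fans removed; the facility cooling plant grows by only 30 kW.
it_after_kw = it_kw - 50             # 950 kW
cooling_after_kw = cooling_kw + 30   # 430 kW
after_total = it_after_kw + cooling_after_kw  # 1380 kW -- less total energy
after_pue = after_total / it_after_kw         # ~1.45 -- a "worse" PUE

print(before_total, after_total)  # total consumption goes DOWN
print(before_pue, after_pue)      # yet PUE goes UP
```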

Gaming the metric simply made it inaccurate, which was bad enough. Any efficiency metric that shows a net gain in data center efficiency (same amount of work accomplished for less energy consumed) as a NEGATIVE is hopelessly broken. This also has the side effect of making a mockery of the EPA’s Energy Star for data centers, since that award is based directly on the data center’s PUE.

Misleading, inaccurate, and now totally wrong, this sucker needs to go where all the other bad ideas go to die.

SwiftWater Telecom Green Eco Cabinet Filler Panels, insulated, lightweight, inexpensive


Friday data center tidbits: #datacenter failure excuses

Ok, so this one’s not so much an excuse as “bleep happens”. Northrop Grumman has apologized for the last major extended Virginia data center failure, sort of. Published accounts attribute the chaos to multiple failures of primary and secondary storage systems. From the article we get:

“In its apology, Northrop Grumman … went on to say that problems of this sort are not unusual with large technology transformation programs.”

Um, yes, yes they are, when the program is being run competently. I’ve seen this excuse used a lot recently and it’s the one thing I could think of that will NOT inspire customer confidence in you. Northrop Grumman, this week’s winner of the special “bleep happens” bozo award.

In another example of how NOT to run things, Microsoft blew up Hotmail for at least 16 hours on Thursday. The automatic response of “well, it only affected a ‘small number’ of users” is nearly as bad as the Northrop Grumman “bleep happens” defense. Call this one “well, at least it was only small scale bleep that happened”. Of course, how “small” can a 16 hour service outage really be?

Email or call me or visit the SwiftWater Telecom web site for green data center services today.



Thursday data center tidbits: “ventilate” the server, Microsoft bumbles the cloud

First up today is the piece about an employee getting drunk and shooting up his company’s server with a .45. I confess to having had the urge to kick a server from time to time, but never to shoot one. If this isn’t a call for cloud computing, I don’t know what is.

Next up is Microsoft botching their cloud computing service for two hours. I don’t know what to add to this except, thanks for doing your part to add to the perception that cloud computing is unreliable, twits.

Email or call me or visit the SwiftWater Telecom web site for green data center services today.



Wednesday data center tidbits: Microsoft’s new “hybrid cloud server”?

First up today is a piece about Microsoft’s new Aurora small business server, and it doesn’t add up. The article claims “Hybrid Small Business Server, codenamed “Aurora” can run on premises and cloud apps”. How does a local server “run” cloud apps? Here’s a clue: if it’s running on a single on-premises server (i.e., not a private cloud), then it’s NOT cloud.

Next up is the piece about Equinix building new data centers with raised floors. I don’t know, maybe their customers are insisting on raised floor because they’re stuck in the old common wisdom that that was the way to do it, but I can’t come up with a single good reason to use raised floors in ANY new data center.

Last up is a piece about putting a data center in a former Model T factory. This is a brilliant reuse of resources. It makes the best possible use of the embodied carbon in the existing building, and it reuses a building that, for all its age, is perfectly suited for a data center (I run my data center in an 1800s former New England textile mill). New isn’t necessarily better.

Tuesday data center tidbits: Microsoft eurekas, #cloudcomputing thought reversal

Wow! I don’t believe it! The uber geniuses at Microsoft have just announced the “non-intuitive” discovery that painting a data center’s roof a color that doesn’t absorb heat from the sun reduces cooling requirements! Sit tight boys, your Nobel Prizes are on the way!

Yesterday, I did a post on adjusting workloads to fit cloud computing rather than just deciding they weren’t appropriate for clouding. Check out this piece for another look at the reversal of thinking that allows almost anything to be clouded efficiently.

Email or call me or visit the SwiftWater Telecom web site for cloud computing services.


Friday data center tidbits: Intuit data center face plants, Google patents stacking, and more!

First up is the piece about the 36 hour failure of Intuit’s data center as the result of a power failure caused by “routine maintenance”. What is it with data centers that they can’t resist screwing with critical power facilities in the name of “routine maintenance”? This has been an ongoing theme in major data center outages for the last several years. Really, if your primary operating power system requires true “maintenance” (and not just BS things like measuring phase rotation in live panels just to check), then you should reconsider your design. Squeaky red noses to Intuit.

News of the ridiculous: Is it REALLY a patentable idea to stack data center containers?

Finally, there’s the next big data center money giveaway, to support Microsoft in Iowa. It must be nice to be that rich and still have state governments shower you with public money.

Email or call me or visit the SwiftWater Telecom web site for cloud computing services.



Friday data center tidbits pt 2:who cares about PUE cheating?

In an article about attempting to move from PUE to a new data center efficiency metric, Christian Belady of Microsoft commented:

“There are perceived issues with it, comments about people cheating on PUE reporting. But who cares?”

Let’s see: now that PUE has been enshrined as a competitive marketing metric between data centers, first via marketing fluff and then as the basis for the EPA’s data center Energy Star rating, everyone who operates a public data center should care. I’ve had my conversations with Christian, and I’ll be the first to admit that, despite its flaws, PUE is a useful INTERNAL metric. But when they start handing out awards based on it, everyone should care that it isn’t being gamed.

