I was reading a story about whether data center uptime (as defined by tier level) affects the data center’s efficiency. I’m sorry, but running 3 hot redundant A/C units in a data center where you only really need one is a very real waste of money and energy, whether or not the EPA declares it statistically significant. There’s an old saying about lies, damn lies, and statistics.
Next up is the piece about the potential for Wall Street companies to rent out excess off-hours capacity in their data centers. These people are hysterical over the undefined security risks of cloud computing and are vowing en masse not to put any critical functions on a cloud, yet they want to rent out their main trading systems to run who knows what. Is there some magic incantation that makes renting their own servers to others more secure than using a cloud computing provider? How do these guys manage to hold two mutually exclusive ideas at the same time without their heads exploding?
Email or call me or visit the SwiftWater Telecom web site for truly green data center services!