
Lipstick on a pig: Facebook’s data center refit.


Vern Burke, SwiftWater Telecom
Biddeford, ME

I’ve just been reading an article today about Facebook retrofitting a data center and all the great energy efficiency gains. Unfortunately, sometimes the best retrofit method for a data center is dynamite.

Most of the modifications mentioned have to do with airflow. Now, I'll be the first one to cheer for improving and controlling airflow to boost data center efficiency. The problem is, how BAD does your airflow situation have to be that you need to run the cold air supply at 51 degrees F?! I thought data centers running in the 60s were out of date; 51 is just pathetic. There was obviously room for improvement here, but the improvement effort only got them to 67, and that's still lousy.
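
For a rough sense of what supply air temperature is worth, here's a minimal back-of-the-envelope sketch in Python. The savings-per-degree figure is an assumed illustrative rule of thumb (not a measured number from Facebook's retrofit), and the temperatures are the ones quoted above.

# Back-of-the-envelope: cooling energy saved by raising the supply air temperature.
# ASSUMPTION: roughly 2% chiller energy saved per degree F of setpoint increase,
# an illustrative rule of thumb, not a measured figure from this retrofit.

SAVINGS_PER_DEG_F = 0.02   # assumed fractional chiller savings per degree F
OLD_SUPPLY_F = 51          # supply temperature before the retrofit (from the article)
NEW_SUPPLY_F = 67          # supply temperature after the retrofit
MODERN_SUPPLY_F = 75       # assumed target for a current hot/cold aisle design

def fraction_saved(old_f, new_f, per_deg=SAVINGS_PER_DEG_F):
    """Estimated fraction of chiller energy saved by raising the setpoint."""
    return min(1.0, (new_f - old_f) * per_deg)

print(f"51F -> 67F: ~{fraction_saved(OLD_SUPPLY_F, NEW_SUPPLY_F):.0%} of chiller energy saved")
print(f"67F -> 75F: ~{fraction_saved(NEW_SUPPLY_F, MODERN_SUPPLY_F):.0%} still left on the table")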

The big problem here comes from continued reliance on the obsolete raised floor as a plenum design. There are plenty of reasons not to use raised flooring in a data center beyond airflow: unproductive floor loading, expense, fire detection and suppression requirements, under floor housekeeping, metal whisker contamination, and a whole host of airflow issues. Since the Facebook retrofit is all about the airflow, I'm going to stick to the raised floor airflow issues.

If you’re really serious about balancing your data center airflow, using a raised floor as a plenum is the last thing to do. First, under floor obstructions make smooth airflow next to impossible, even if you’re totally conscientious about housekeeping. Second, there’s zip for fine control of where the air is going. Need to add just a small amount of air here? Sorry, you take multiples of full tiles or nothing. Third, pull a tile to work on underfloor facilities and you immediately unbalance the entire system. Pull a dozen tiles to put in a cable run and you now have complete chaos across the whole floor. Finally, make any changes to your equipment and you have to rebalance the whole thing.
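
To put the "full tiles or nothing" problem in numbers, here's a small illustrative sketch. The per-tile airflow and per-kW airflow requirement are assumed values; real figures depend on tile type, plenum pressure, and the equipment involved.

import math

# Illustrative only: how coarse the airflow control is when the only knob
# is adding or removing whole perforated tiles.
CFM_PER_TILE = 500   # assumed delivery of one perforated tile at typical plenum pressure
CFM_PER_KW = 120     # assumed airflow requirement per kW of IT load

def tiles_needed(rack_kw):
    """Return (required CFM, whole tiles needed, CFM actually delivered)."""
    required = rack_kw * CFM_PER_KW
    tiles = math.ceil(required / CFM_PER_TILE)
    return required, tiles, tiles * CFM_PER_TILE

for kw in (3, 5, 8):
    required, tiles, delivered = tiles_needed(kw)
    print(f"{kw} kW rack: needs {required} CFM, gets {tiles} tile(s) = {delivered} CFM "
          f"({delivered - required:+d} CFM of slop)")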

These things are so inefficient that it isn't any wonder a lousy design would need ridiculously cold air to make it work. 67 is certainly an improvement; now they've gotten things up to being only 5-10 years out of date.

When Facebook actually retrofits a data center all the way up to modern standards, I’ll be impressed. This operation is still a pig underneath, no matter how much lipstick you put on it.


Friday data center tidbits: data centers gone wild, Facebook behind the times


First up today is the piece about the federal government “finding” 1000 more data centers than they thought they had. First of all, how does your data center inventory get so bad that you lose track of 1000 data centers? Second, how in the world do you allow the data center sprawl to get so far out of control? That’s a total of 2100 federal data centers, an average of 42 data centers for every single state. Last but not least, who in the world would think it’s a bad idea to consolidate these?

The federal government, data center bozos of the week; the truckloads of red squeaky noses are on their way.

The next piece is about Facebook saving big by retooling its data center cooling. Really, is it big news that not mixing cold intake air with hot exhaust air is a good idea? If Facebook is pushing this as a “look how great we are” point, they’re about 5 years too late.

Finally, here’s a survey about the causes of data center downtime. Not maintaining the data center backup batteries and overloading the backup power are just plain silly, but the telling one for me is 51% accidental human error, including false activation of the EPO (emergency power off). I’ve said it before, the gains that the National Electrical Code allows the data center in exchange for the EPO are NOT worth this kind of failure. EPO=EVIL, period.

Email or call me or visit the SwiftWater Telecom web site for green data center services today.

Vern


Thursday data center tidbits: Chaos at Facebook (again)


For more than 5 hours now, developers have been unable to make changes to the Facebook Integration settings for their apps. Attempts to change these settings return:

Sorry, something went wrong.

We’re working on getting this fixed as soon as we can.

This failure doesn’t seem to be affecting apps that are currently running, but it has brought a fair amount of app development to a total stop.

This failure comes close behind the recent major Facebook outage caused by a runaway software integrity checker.

Vern

SwiftWater Telecom

Monday data center tidbits: Unhyping the cloud, Facebook INsanity checks


First up today is a piece about cutting out the cloud computing hype. The problem isn’t so much the hype over cloud computing as it is the rampant outright misuse of the term cloud, attaching it to things that aren’t even remotely cloudy in an attempt to ride cloud computing’s coattails without actually making the effort to DO cloud computing.

Next up is the recent Facebook service fiasco. Getting your system to mark its own code as invalid and not hand the problem to a human to validate before taking radical action is especially brain dead. On top of that, you now have an error compounded on top of the original one, a cascading failure, all because there was no sanity checking and no break for human input. Facebook gets our data center bozos of the day award for trusting too much in automation and then blowing it.
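
The fix for this class of failure isn't complicated, it's discipline: sanity check what the automation wants to do and stop for a human when the answer looks absurd. Here's a minimal sketch of that idea in Python; the function names and threshold are hypothetical illustrations, not anything from Facebook's actual systems.

# Minimal sketch of a sanity gate in front of automated remediation.
# Names and the threshold are hypothetical illustrations, not Facebook's code.

MAX_FRACTION_FLAGGED = 0.01   # if over 1% of records look "invalid", suspect the checker itself

def remediate(flagged, total, apply_fix, page_human):
    """Apply the automatic fix only if the damage estimate passes a sanity check."""
    fraction = flagged / total if total else 0.0
    if fraction > MAX_FRACTION_FLAGGED:
        # The checker claims a huge chunk of the system is bad. It's far more
        # likely the checker itself is broken, so stop and get a human to look.
        page_human(f"Integrity checker flagged {fraction:.1%} of records; refusing to auto-fix.")
        return False
    apply_fix()
    return True

# Example usage with stand-in callbacks:
remediate(flagged=120_000, total=1_000_000,
          apply_fix=lambda: print("fix applied"),
          page_human=print)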

Email or call me or visit the SwiftWater Telecom web site for green data center services today.

Vern


Thursday data center tidbits.


First up is the story about North Korea creating their own Linux distribution. I’m waiting for someone to ask for this on a data center server (go ahead, I dare ya!) :).

Second is the piece about Goldman Sachs following Facebook and Google by putting backup batteries in servers. It’s amazing how a bad idea can have legs sometimes (have fun recycling all that toxic heavy metal, guys!).

Email or call me or visit the SwiftWater Telecom web site for cloud computing services and green data center services. (no, we don’t support North Korean Linux!)

Vern


Wednesday data center tidbits.


First up today comes the updated list of who has the most web servers in their data centers. I was pondering Google having in excess of 450,000 servers and matching that against all the recent publicity about Google putting backup batteries in the servers. What a phenomenal expense that would be, not to mention all the toxic heavy metals and the absolute nightmare of maintaining all those batteries. Greening the data center is an admirable goal, but I don’t see this as even close to practical.
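
To put a rough number on "phenomenal expense", here's a back-of-the-envelope sketch. The server count is the figure cited above; the per-battery cost and replacement interval are assumed illustrative values, not Google's actual numbers.

# Rough cost sketch for putting a backup battery in every server.
SERVERS = 450_000       # server count cited above
BATTERY_COST = 30       # assumed cost per in-server battery, USD (illustrative)
REPLACEMENT_YEARS = 4   # assumed battery service life before replacement

initial = SERVERS * BATTERY_COST
annual = initial / REPLACEMENT_YEARS

print(f"Initial outlay: ${initial:,.0f}")
print(f"Ongoing replacement: ~${annual:,.0f} per year, plus the labor and "
      f"disposal for ~{SERVERS // REPLACEMENT_YEARS:,} batteries a year")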

Best non-answer answer of the day: Facebook’s response to criticism over its new Oregon data center being powered primarily by coal. Yes, we know Facebook has all these wonderful efficiency programs, but they didn’t actually respond to the specific criticism raised. When these big companies saturate the media with PR about how green they are, this kind of criticism is absolutely valid.

Call or email me or visit the SwiftWater Telecom web site for green data center and cloud computing services minus the hype.

Vern


Wednesday data center tidbits.


First up this morning is the news about HP’s new 20 foot data center container. Let’s do a little math on this. The container is $600,000 for 500U of space. Just space: no IT equipment, no cooling, no power gear, just a steel box with racks. That’s $1200 per 1U. I’m pretty sure I can rehab enough conventional floor space for 500U of equipment for FAR less than $1200 per 1U. Overpriced doesn’t begin to describe it.
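
Here's the arithmetic behind that complaint as a quick sketch; the conventional rehab cost per 1U is an assumed comparison figure, not a quote.

# Cost per rack unit: HP container versus a conventional floor rehab.
CONTAINER_PRICE = 600_000    # container price from the article, USD
CONTAINER_CAPACITY_U = 500   # rack units of space in the container
REHAB_COST_PER_U = 300       # assumed cost to rehab conventional space per 1U (illustrative)

container_per_u = CONTAINER_PRICE / CONTAINER_CAPACITY_U
print(f"Container: ${container_per_u:,.0f} per 1U")
print(f"Assumed conventional rehab: ${REHAB_COST_PER_U:,} per 1U "
      f"({container_per_u / REHAB_COST_PER_U:.0f}x less)")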

Next up is Intel building 8 huge solar power arrays, 3 of them in Oregon, which is surprising given how much dirt cheap hydro power Oregon touts. There’s really no better use for the wasted space of a flat building roof.

Vern
