Tag Archives: free air cooling

Wednesday data center tidbits: no power backup in the #datacenter?


First up today is about the idea of building a data center with no power backup at all. This is about as boneheaded a thing as I’ve ever seen. Does it REALLY pay to not only duplicate but also run extra infrastructure, just so you can save a bit in equipment costs by letting a data center fail? What about the cost of restoring all the downed equipment? Or the damage to equipment from wild power fluctuations that a battery backed system (such as the 48V DC power plant in our data center) would absorb? Squeaky red noses to Yahoo on this one.

Next up is a piece about improving data center airflow. What caught my eye was this, “…flowing air from the CRAC unit, through specialized floor tiles and into the server fans…”. Attempting to tout cooling efficiency with a hopelessly obsolete raised floor is an automatic FAIL.

Email or call me or visit the SwiftWater Telecom web site for cloud computing services.

Vern

Data center DC power, power backup runtime, and free air cooling: a greener shade of green.


I’ve written previously here about the green, energy saving benefits of DC power in the data center, the reliability follies of super short run time power backup, and, of course, the well recognized benefits of free air cooling. In this post, I’m going to discuss making the best green use of all three of these together in the data center.

The “classic” data center power backup system is the double conversion UPS. In this scenario, commercial AC power is rectified to DC for the backup batteries and then inverted back to AC to supply the data center equipment. This configuration has three points of efficiency loss: the rectifiers converting AC to DC, the inverters converting DC back to AC, and the load power supplies converting AC to DC yet again. The data center DC power plant does away with two of the three loss points by eliminating the DC to AC inverter and the AC to DC power supply in the load equipment.
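To put rough numbers on that, here’s a quick Python sketch comparing the two power paths. The per-stage efficiencies are assumed, illustrative figures, not measurements from any particular plant, but they show why removing two conversion stages matters.

```python
# Rough sketch of end-to-end power path efficiency.
# All per-stage efficiencies below are assumed, illustrative values.

def chain_efficiency(*stages):
    """Multiply per-stage efficiencies to get end-to-end efficiency."""
    total = 1.0
    for stage in stages:
        total *= stage
    return total

# Double conversion UPS: AC -> DC (rectifier), DC -> AC (inverter),
# then AC -> DC again inside each server power supply.
double_conversion = chain_efficiency(0.95, 0.94, 0.90)

# 48V DC plant: one AC -> DC rectification, with the load equipment
# running directly from the DC bus (no inverter, no AC server supply).
dc_plant = chain_efficiency(0.95, 0.92)

print(f"Double conversion UPS path: {double_conversion:.1%}")
print(f"48V DC plant path:          {dc_plant:.1%}")
```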

The second part of this equation is the backup power itself. The trend toward incredibly short run time backup power (such as flywheels with only 15 seconds of run time) is a foolish gamble that fallible generators are going to work perfectly every time. If there’s even a small generator issue that could easily be dealt with given a few minutes, there simply is no time, and the facility goes down hard.

The third part is the free air cooling. It really goes without saying that using cooler outside air for cooling is far more efficient than any type of traditional data center air cooling.

So, how do these three things tie together to make the data center even greener than any one of them separately? Many data centers use load shifting to save power costs (for example, freezing water at night, when power is cheaper, and using the ice for cooling during the day). I call my variation on this idea heat shifting.

My data center is equipped with an 800A 48VDC modular power plant configured N+1, a battery string capable of 8 hours of run time, and free air cooling. The idea is simply to pick the hottest part of the day (usually early afternoon) and remove heat load from the free air cooling by shutting down the rectifiers and running the facility from the battery string for 2 hours.
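Here’s a back-of-the-envelope Python sketch of what that buys you. The equipment load and rectifier efficiency below are assumed example figures, not our actual plant readings; the point is that the rectifier draw and its waste heat move out of the hottest window, while the batteries still keep hours of runtime in reserve.

```python
# Back-of-the-envelope sketch of "heat shifting": run the facility from
# the battery string during the hottest 2 hours so the rectifiers (and
# their waste heat, plus the utility draw) move out of the window when
# free air cooling is least effective. Recharge happens later, when
# outside air is cooler. All figures below are assumed examples.

BUS_VOLTAGE = 48.0           # volts, nominal DC bus
LOAD_CURRENT = 300.0         # amps, assumed equipment load
RECTIFIER_EFFICIENCY = 0.92  # assumed
SHIFT_HOURS = 2.0            # early afternoon discharge window
RUNTIME_HOURS = 8.0          # battery string runtime at this load

load_kw = BUS_VOLTAGE * LOAD_CURRENT / 1000.0
rectifier_input_kw = load_kw / RECTIFIER_EFFICIENCY
rectifier_loss_kw = rectifier_input_kw - load_kw

utility_draw_shifted_kwh = rectifier_input_kw * SHIFT_HOURS
rectifier_heat_avoided_kwh = rectifier_loss_kw * SHIFT_HOURS

print(f"Equipment load:                           {load_kw:.1f} kW")
print(f"Rectifier waste heat:                     {rectifier_loss_kw:.2f} kW")
print(f"Utility draw moved off-peak:              {utility_draw_shifted_kwh:.1f} kWh")
print(f"Rectifier heat kept out of the afternoon: {rectifier_heat_avoided_kwh:.1f} kWh")
print(f"Battery margin left after the shift:      {RUNTIME_HOURS - SHIFT_HOURS:.0f} hours of runtime")
```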

This shifts that part of the heat load of the data center to times when the free air cooling is operating more efficiently, allowing the free air cooling the elbow room to support more productive equipment load. Additionally, you have the side effect of exercising the batteries regularly, avoiding the ills that can plague idle batteries, such as stratification and sulfation.

As if there weren’t already enough great reasons to use green DC power, long run backup, and free air cooling in the data center, here’s another one.

Email or call me or visit the SwiftWater Telecom web site for green data center services today.

Vern


The green data center, avoiding natural (and unnatural) disasters. #datacenter


I’ve just been reading the back and forth over the recent volcanic eruption in Iceland and its impact on data centers versus the “natural disasters happen everywhere” folks. Yes, natural disasters happen, but that doesn’t mean you have to invite them into your data center.

First, let’s consider DCK’s question about the effect of volcanic ash on the data center. Volcanic ash is incredibly abrasive (which is why you don’t want to fly an airplane through it), it’s electrically conductive, and it contains sulfur dioxide, which forms sulfuric acid in the presence of water. Not the kind of thing you’d want in any mechanical equipment (fans on air handlers, etc.), not the kind of thing you’d want in any electrical or electronic equipment (shorts galore), and not the kind of thing you’d want in any delicate equipment (corrosion city).

It’s certainly true that you could avoid this to some extent by building a data center upwind of potential volcanic activity. The problem is, “upwind” isn’t set in stone, and volcanic ash can travel a long way.

It’s true that there are natural disasters everywhere. It’s also true that not all natural disasters are alike. A blizzard has the potential to interrupt data center operations. Planting a data center on the Gulf coast or the mid-Atlantic East coast puts it dead center for a major hurricane that could very possibly flatten it. Park your data center on top of a highly active seismic or volcanic area and you’re at risk for a “smoking hole” disaster. Locate on top of the Yellowstone supervolcano or in a place like Naples, Italy (within the kill zone of Mt. Vesuvius) and the most probable natural disaster suddenly rises to “not survivable, period”.

Freak natural disasters can come out of the blue. This year we had an interruption at our primary data center due to an unusual spring storm packing hurricane force winds. We simply migrated services to our backup data center (120 miles away) and, once power was restored, restarted the facility with no damage at all.

The economic lure of things like data center free air cooling in Iceland is real, but the bull’s eye target is like a rattlesnake’s rattle. Heed the warning and don’t roll the dice with your data center.

Don’t wear a target, email or call me or visit the SwiftWater Telecom web site for truly green data center services!

Vern


Tuesday data center tidbits: ASHRAE and free air cooling and more #datacenter


First up is reading about the uproar over ASHRAE adding free air cooling as a specific method of data center cooling to its standards. The attack reaction of Google, Microsoft, Amazon, etc. to this is curious, since the standard doesn’t MANDATE the use of free air cooling at all. Of course, a mandate wouldn’t make any sense, because there are areas where the ambient temperature is so high that free air cooling couldn’t be used for a data center at all. When you have a major overreaction to something like this, it makes me wonder what else might be going on here.

The next piece is about whether distance makes a difference in data center co-location. It’s been hard enough to get people to stop hugging servers in the first place, even with all the economic benefits of co-location; just the thought of sending their servers across the country can make a lot of them break out in a sweat.
Stop hugging your servers today! Email or call me or visit the SwiftWater Telecom web site for co-location services and a happy home for them!

Vern


Thursday data center tidbits.


This hasn’t been a good week for major Internet services. First up was Wikipedia going splat yesterday due to an overheating situation in the data center and then a failure of their failover procedure. I won’t start on the failure of their data center provider to detect and head off whatever infrastructure failure caused the overheating before it reached the catastrophe point, but discovering your failover plan is garbage during said catastrophe is just pathetic.

Update: Yes, I will start on them. DCK reports that the EU data center that Wikipedia is in uses free air cooling. How in HECK do you have a major cooling failure when Mother Nature is doing the work?

Next up is the failure of YouTube’s main web site for approximately 90 minutes this morning. No word on the cause of this one yet but I think it must be something in the water.

Email or call me or visit the SwiftWater Telecom web site for cloud computing services and reliable green data center services (Mother Nature doesn’t go on strike on us!).

Vern


The green data center, the EPA, Energy Star, PUE, and total cluelessness.


I’ve just been reading a post about the EPA’s new Energy Star rating for data centers. It’s astonishing when you see the EPA having this big a bozo attack.

The first item on the list is the use of PUE (power usage effectiveness) as the criterion for awarding the Energy Star rating (I have no idea what happened to the EPA’s EUE metric they were trumpeting just a few months ago). In simple terms, PUE is the ratio of the total power used by the data center to the power used by the IT equipment alone. The problem with this approach is that it doesn’t take into account the actual amount of work being accomplished for that power, so it actually encourages the deployment of idle and underutilized equipment to skew the PUE metric for publicity and marketing purposes. Not very green at all.
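To see how easy the metric is to game, here’s a small Python illustration with made-up numbers (they’re not from any real facility):

```python
# Illustration (with made-up numbers) of how PUE is computed and how
# adding idle IT load can make PUE look better without doing any more
# useful work.

def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Baseline: 500 kW of IT load plus 300 kW of cooling and power
# distribution overhead.
baseline = pue(total_facility_kw=800.0, it_equipment_kw=500.0)

# Now add 200 kW of idle servers. Overhead grows a little (say 60 kW
# more cooling), but the ratio still drops, so the "green" number
# improves even though no additional work is being done.
padded = pue(total_facility_kw=800.0 + 200.0 + 60.0, it_equipment_kw=700.0)

print(f"Baseline PUE:          {baseline:.2f}")
print(f"PUE with idle IT load: {padded:.2f}  (looks greener, isn't)")
```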

PUE can be useful for evaluating the effect of changes made to a single facility, but it’s worse than useless for trying to compare multiple non-homogeneous data centers. It’s so easy to skew that any comparisons are not only misleading, they’re downright dishonest as well.

The really astonishing thing is to hear the EPA Energy Star manager claim that the Energy Star rating didn’t have to take into account climate differences due to data center location because they didn’t have any statistical effect on PUE. This is the most clueless thing I’ve ever heard.

I’m sure all of these major companies that have located data centers in cooler climates to take advantage of free air cooling, drastically reducing the electrical consumption required to cool the data center, will be shocked to hear this (free air cooling circulates cool outside air to remove heat without any requirement for chillers). All of the massive power savings trumpeted in the news lately must just be a myth. Keep in mind, reducing cooling power usage reduces the overall power usage of the facility, driving down PUE.

Even for legacy cooling systems, cooler outside air increases efficiency. The larger the difference in temperature, the easier it is for the cooling system to dissipate or sink the waste heat. This means less energy involved in removing the same amount of heat.
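For a concrete (if simplified) picture, here’s a Python sketch of heat rejection through a heat exchanger, where the rejection rate scales roughly with the temperature difference between the waste heat and the outside air. The conductance and exhaust temperature are assumed illustrative values, not real equipment data:

```python
# Simplified illustration: heat rejection through a heat exchanger
# scales roughly with the temperature difference between the waste
# heat and the outside air (Q ~ U * A * delta_T). The figures below
# are assumed, illustrative values, not real equipment data.

UA_KW_PER_DEGC = 2.0    # assumed overall conductance of the heat exchanger
EXHAUST_TEMP_C = 40.0   # assumed hot-aisle / condenser-side air temperature

def heat_rejected_kw(outdoor_c):
    """Approximate heat rejection rate for a given outdoor temperature."""
    return UA_KW_PER_DEGC * max(EXHAUST_TEMP_C - outdoor_c, 0.0)

for outdoor in (35, 25, 15, 5):
    print(f"Outdoor {outdoor:>2} C -> roughly {heat_rejected_kw(outdoor):.0f} kW rejected")
```

The same hardware sheds far more heat when the outside air is cool, which is why cooler climates cut the energy spent on cooling and pull PUE down.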

To say that these techniques have no statistical bearing on PUE flies in the face of the laws of physics and just plain reality. This isn’t just some theory, it’s being done and it’s being proven out.

I wrote a week ago that this process was going to be a real circus, and it’s nice to see them bearing me out. Alexandra Sullivan, EPA Energy Star manager, and the EPA as a whole get an en masse award for Data Center Bozos of the Week. The bulk shipment of multicolored wigs and red squeaky noses is on the way.

Call or email me or visit the SwiftWater Telecom web site for hype (and bozo) free green data center and cloud computing services.

Vern


Greening the data center: Is bigger better?


Tonight I’ve been reading about big vs small data centers. So, in terms of green, how big is too big?

Is bigger or smaller better from a green standpoint? There are pros and cons to both. The bigger the facility, the more economic advantage there is from volume buying and standardization. On the down side, large open spaces make it much more difficult to control airflow, which is a key to greening the data center.

Large facility roofs provide great locations for green solar power but also absorb large amounts of heat from the sun. Also, the larger the facility, the more strain placed on local utilities, such as power and water for cooling.

On the other hand, small facilities can also be difficult to ventilate properly for cooling. On the plus side, small facilities can be tweaked easily to optimize them for the specific environmental conditions. Also, there’s the disaster avoidance benefit of not packing everything into a single enormous facility, avoiding single points of failure.

So where does this leave us? I think the gargantuan facilities will be few, and that modestly sized facilities will be the easiest to green: right-sized for optimal cooling and not so large as to stress local utilities.

Like a lot of other things, extremes either way tend to be less than optimal.

Vern, SwiftWater Telecom

data center, web hosting, Internet engineering