
The Christmas data center list for Santa.


I’ve been extra good this year, so I thought I’d share my data center Christmas list to Santa (we installed a chimney just for him). I know Santa isn’t the most savvy tech guy, so I attached full specs for everything on the list.

One new server with multiple 6-core AMD Opteron CPUs, tons of RAM, and a huge RAID storage array. Heck, I’d really like a whole rack of them (do I go on the naughty list if Santa gets a hernia muscling this down the chimney?).

A full 100Gb/s Ethernet network. Well, those fancy servers aren’t just going to throw paper airplanes at each other!

A new building would make a top of the line present. See, there’s this great empty place across the street that would be perfect for a 100K sq ft ultra green data center (ok, maybe I haven’t been THAT good).

I’d like a new Tux the Linux Penguin beanie to replace the one that mysteriously walked away from my desk this past year.

I’d like a whole bunch more customers like the ones we acquired over the last few months. Friendly, easy to deal with, and appreciative of what we do for them, they’re a pleasure to work for (and if it’s a pleasure, it’s not really work 🙂 ).

Finally, just to show I’m not selfish about asking for things for only me, I’d like to add on a wish for the former landlord who gave us so much heartburn this year. Santa, if it isn’t too much trouble, could you gift wrap him a transmission failure in the middle of Harlem at 2am? Please and thank you!

Or did that just land me on the naughty list?

Vern, SwiftWater Telecom

Data center vendor bozo of the week …


And the winner is Microsoft!

I spent much of the day yesterday working with a colleague to replace an ancient customer-owned server running Windows Server 2000 on consumer grade hardware with a nice new machine running Windows Server 2003 R2 on top-notch server grade hardware. While I am primarily a FreeBSD and Linux guru, this was far from the first Windows Server machine for either of us.

After cruising easily through the setup wizards, we were left with a strange problem. The server could see the router, surf the net, and accept incoming connections from outside of its own subnet, but it would refuse traffic from any machine in its own subnet. Even stranger, it would respond to pings from other machines on the same subnet while it was booting, but at some point in the boot process, it would stop responding.

After much unpleasant language and threats of electronic mayhem, we discovered the answer. Apparently because this server had 2 Ethernet NICs, the all knowing setup wizards decided to activate NAT to bridge between them. No option was ever given to do this during the setup, someone just thought it would be a good idea to set it and then bury the setting to the point that it took mining gear to dig it out.
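If you hit the same surprise, the buried setting lives under Routing and Remote Access, not the NIC properties. Here’s a rough sketch of where to look from the command line, using the Server 2003-era `netsh routing ip nat` context (the exact interface name is an example; check the `show` output on your own box before deleting anything):

```shell
rem List the global NAT configuration and which interfaces it is bound to
netsh routing ip nat show global
netsh routing ip nat show interface

rem Detach NAT from one NIC (name as it appears in Network Connections)
netsh routing ip nat delete interface "Local Area Connection 2"

rem Or rip the NAT component out entirely
netsh routing ip nat uninstall
```

These are Windows-only configuration commands, so treat the sketch as a starting point rather than a copy-paste fix.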

For this spectacular display of bozosity in not asking if we wanted this capability turned on or not, we award Microsoft the coveted multi-colored wig and red squeaky nose of shame.

Vern, SwiftWater Telecom

Top 10 ways to make sure your servers make it to the data center …


Top 10 ways to make sure that your equipment survives its trip to the data center!

So, you’ve decided to take the plunge and ship your equipment to a data center to install for you. These tips will make sure everything arrives intact!

1. Whenever possible, use modern hard drives spec’d for high shock loads.

Far and away, most damage that occurs to customer equipment in transit is mechanical shock to hard drives, usually from being dropped or tossed around.

2. Equip your package with a shock indicator.

Shock indicators are inexpensive devices available from many shipping supply companies that provide a permanent indication that the package has been dropped, thrown, or otherwise abused in transit. An indispensable aid for claiming insurance for damaged shipments!

3. Leave plenty of room in the box around your equipment.

Leaving room greatly decreases the odds that your equipment will be damaged by crushing of the package or impact from outside the package.

4. Use double-walled boxes.

Double-walled boxes are much stronger and more impact resistant.

5. Use anti-static packing material.

Many standard packing materials generate large amounts of static electricity that can damage your equipment. As a rule of thumb, anti-static materials are pink in color.

6. Avoid using loose packing peanuts.

Loose packing peanuts will allow your equipment to settle through them, eliminating the buffer zone between your equipment and the outside of the box.

7. Don’t skimp on the bubble wrap!!!!!!

Remember to wrap all accessories and cables included in the package!

8. Double tape all seams!

Great packing material does no good if your equipment ends up on the ground because the bottom tape of the box gives way!

9. Ship equipment assembled in its case rather than sending individual components wherever possible.

10. Make sure sensitive equipment is marked “sensitive”, “fragile”, “handle with care”, “do not bend”, etc.

Warning the shipper of the handling requirements for your equipment is the easiest way to increase the odds of it reaching its destination intact! For best results, use preprinted warning stickers in bright, eye-catching color patterns.

Double check these things before your server goes out the door and you and your data center will have a super smooth installation!

Vern, SwiftWater Telecom

data center server co-location

Top 10 ways to make sure your data center co-location goes right the first time!


So, you’ve decided to take the plunge and take your server to a data center to install yourself, or you’re packing it up to ship to a data center to install for you. These tips will make sure things go right!

1. Make sure you have an appropriate mounting method and it meets what the data center expects.

Rule of thumb: 1U or 2U cases less than 20″ in depth, and most 3U or 4U cases, can be mounted with fixed ears directly to a 2 post rack, 4 post rack, or a cabinet. Full length 1U or 2U servers, as well as 3U or 4U servers that are rear-heavy (most of the weight toward the back), must have slide rails to be supported correctly.
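That rule of thumb can be boiled down to a few lines of code. This is just my heuristic from the tip above, not any official rack spec, and `mounting_method` and its thresholds are made-up names and numbers for illustration:

```python
def mounting_method(height_u: int, depth_in: float, rear_heavy: bool = False) -> str:
    """Rough rule of thumb for how a server case should be mounted.

    height_u   -- case height in rack units (1, 2, 3, 4, ...)
    depth_in   -- case depth in inches
    rear_heavy -- True if most of the weight sits toward the back
    """
    if rear_heavy:
        return "slide rails"          # rear-heavy boxes always need rails
    if height_u <= 2 and depth_in < 20:
        return "fixed ears"           # short 1U/2U cases hang fine on ears
    if height_u in (3, 4):
        return "fixed ears"           # most 3U/4U cases are fine on ears too
    return "slide rails"              # full-length 1U/2U cases need rails

print(mounting_method(1, 14))                    # fixed ears
print(mounting_method(2, 26))                    # slide rails
print(mounting_method(4, 26, rear_heavy=True))   # slide rails
```

When in doubt, ask your data center what hardware they expect before you ship.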

2. Make sure that your server case uses a front to back airflow path.

Some cases use a side to side, front to top, or even back to front
path. It’s best to avoid these odd cases and stick with the standard
front to back path.

3. If you intend to use an Ethernet switch and cable to a server with rear Ethernet ports, make sure you’ve planned on at least 1U of space between them to get the cables from front to back!

Or plan to mount the Ethernet switch on the back side of the 4 post rack or cabinet.

4. Make sure you have the proper type of network connection available!

You’ll have an unpleasant surprise if you need a fiber Ethernet connection and you only have a Cat5 connection available!

5. Make sure your fiber connection matches the data center’s!

Make sure that you have either single mode or multimode fiber jumpers as needed (single mode jumpers are typically yellow, multimode jumpers are orange or aqua). Make sure you have the right connector types (LC and SC are most common).
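A quick pre-packing sanity check for this tip might look like the snippet below. The jacket colors follow the usual convention (yellow for single mode, orange for OM1/OM2 multimode, aqua for laser-optimized OM3/OM4), but colors aren’t guaranteed by any cable vendor, so `check_jumper` is a hypothetical helper, not a substitute for reading the jacket print:

```python
# Typical fiber jumper jacket colors (convention, not a guarantee)
CONVENTION = {
    "single mode": "yellow",
    "multimode (OM1/OM2)": "orange",
    "multimode (OM3/OM4)": "aqua",
}

def check_jumper(data_center_fiber: str, jumper_color: str) -> bool:
    """True if the jumper you packed matches the fiber the data center hands off."""
    return CONVENTION.get(data_center_fiber) == jumper_color

print(check_jumper("single mode", "yellow"))   # True  - right jumper
print(check_jumper("single mode", "orange"))   # False - you packed multimode!
```

The same idea applies to connector types: verify LC vs. SC against the data center’s patch panel before the server leaves your shop.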

6. If you expect your data center to make serial port connections for you, make sure to not only include the proper size and gender cable but also a null modem adapter if required!

7. Make sure that you supply a PDU (power distribution unit) if the data center is not supplying one.

Make sure your PDU is compatible with the voltage and format of power your data center is supplying to you!

8. Make sure that you supply the proper format of power cord to attach to the PDU.

Typical PDUs use 5-15, C13, or C19 outlets for your power cords. The data center power connection, if you’re supplying your own PDU, is typically a 5-15 or 5-20 straight-blade plug, or an L5, L6, L14, or L15 twistlock.

9. Make sure that you have the BIOS of your server configured to support USB keyboards and not to halt booting on a missing keyboard error.

10. Make sure that you have the BIOS of your server configured to automatically restart after a power failure.

Double check these things before your server goes out the door and you and your data center will have a super smooth installation!

Vern, SwiftWater Telecom

server co-location, web hosting, data center services

Greening the data center: Out with the old …


This evening I’ve been reading a blog article about The Planet running tower cases in their data centers. I can’t see for the life of me how this makes sense.

It’s certainly true that tower case setups offer flexibility. There’s space for pretty much whatever add in cards you could want and plenty of room for ridiculous amounts of drives. There’s also no doubt that they have more room for air flow inside, making it easier to get the heat out of them even with lousy cooling.

On the other hand, with 1TB drives common and inexpensive, is there really a need for a dozen drive bays? Especially since the trend is well away from massive amounts of server attached storage in the data center? Not to mention the amount of power underutilized drives waste. Not very green at all.

Since even the most compact of 1U server configurations can be had with almost everything desirable for ports, controllers, and video, is there really the need for major amounts of slots anymore? It seems to me that most of the upgrades that would be put in such a system would be absolutely useless in a server (who needs a gamer video card in a co-located server?).

What is true is that there is a massive difference in space consumed between towers and rackmount cases. The Planet seems to think that’s good, since low density means less heat and less power. Unfortunately, it also means less revenue for the facility. Operating inefficiently because we don’t want to bother with a good cooling design is a lousy tradeoff.
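Some back-of-the-envelope math shows just how big that space difference is. All the numbers here are my own rough assumptions (a standard 42U cabinet, a mid-tower roughly 10U tall, four towers per shelf), not measurements from The Planet’s facilities:

```python
RACK_U = 42           # usable rack units in a standard cabinet
TOWER_HEIGHT_U = 10   # a mid-tower stood on a shelf eats roughly 10U
TOWERS_PER_SHELF = 4  # assumed towers that fit side by side per shelf

rackmount_servers = RACK_U                         # 42 1U servers per cabinet
tower_shelves = RACK_U // TOWER_HEIGHT_U           # 4 shelves of towers
tower_servers = tower_shelves * TOWERS_PER_SHELF   # only 16 towers

print(rackmount_servers, "rackmount vs", tower_servers, "towers per footprint")
```

Under those assumptions, the same floor footprint holds 42 rackmount servers or 16 towers, which is a lot of billable space to give away.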

The biggest nail in the coffin for towers in the data center is this: how do you control cooling air flow? With towers on open racks, it would be virtually impossible to separate the hot exhaust air from the cold intake air. It’s the nightmare of anyone who cares in the slightest about greening data centers.

There was some economic justification for doing this 10-15 years ago. It’s 2009, time to relegate this long obsolete data center design to ancient history.

Vern, SwiftWater Telecom

data center, web hosting, Internet engineering