
MOVE IT! 10 top tips for shifting your data centre

From cable ties to rack size – sweat the small stuff

By Dave Cartwright, 18 Mar 2015

The scenario's a hauntingly familiar one. You're the IT person who's just been told by the boss: “We're moving the kit to <insert name of whatever data centre he's signed up with in a panic>, now get on and do it.”

I've done more than my fair share of migrations from on-premise systems to data centres – and more often than not the move has been driven by some disaster or other that's hit the office and nobbled one or more important applications.

Here are my ten steps to setting up your kit in a data centre.

1. The connection

If you're putting your corporate systems in the data centre, you presumably still need to be able to connect to them from the office and to be sure that the apps are as responsive and snappy as the users need them to be. Although you can theorise that it's OK for things to be a little slower than they were with an on-premise installation, the reality is that the users won't be convinced and so you have to be sensible about connectivity.

This doesn't mean, though, that you have to go completely mad and buy a super-fast point-to-point link. I've run perfectly workable services over a VPN on a decent high-speed internet connection for a few hundred quid a month without the users complaining once. What matters is that the connection you choose is fast enough and sufficiently stable for performance to be consistent. What I'm saying is: don't go out and get a crappy DSL circuit and expect the users to be happy – spend what you need to on either a decent VPN-enabled internet connection or a leased line, but don't go mad and buy bandwidth you're never actually going to use.
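
One way to judge whether a link is "fast enough and sufficiently stable" before the users do it for you is to measure connection latency repeatedly and look at the spread as well as the average. Here's a minimal sketch using only the Python standard library – the gateway hostname and port are placeholders for whatever sits at your data centre end:

```python
# Rough link-consistency check: repeatedly time TCP connects to a host at the
# data centre end and summarise mean, spread and failures. A high standard
# deviation hints at an unstable link even when the average looks fine.
import socket
import statistics
import time

def tcp_connect_times(host, port, samples=10, timeout=3.0):
    """Return a list of TCP connect times in milliseconds (None = failed)."""
    times = []
    for _ in range(samples):
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                times.append((time.monotonic() - start) * 1000.0)
        except OSError:
            times.append(None)  # record the failure rather than crashing
        time.sleep(0.5)        # space the samples out a little
    return times

def summarise(times):
    """Mean and standard deviation of successful samples, plus failure count."""
    ok = [t for t in times if t is not None]
    lost = len(times) - len(ok)
    if not ok:
        return {"mean_ms": None, "stdev_ms": None, "lost": lost}
    stdev = statistics.stdev(ok) if len(ok) > 1 else 0.0
    return {"mean_ms": statistics.mean(ok), "stdev_ms": stdev, "lost": lost}
```

Run `summarise(tcp_connect_times("vpn-gw.example.com", 443))` (a hypothetical gateway) from the office at a few different times of day: a consistent 20ms is far better news for your users than an average of 15ms with wild swings.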

2. Security

Before you head to the data centre for the first time, remember that if their security is any good then the security bod on the front desk isn't going to let you in unannounced. First, make sure your boss has put you and whomever you're taking with you (the only person who'll thank you for installing rack-mounted servers single-handed is your osteopath as he hands you the bill) on the access list, and that you know where your bits of the world are, just in case you get the out-of-hours guard who doesn't really have a customer interface or any knowledge of how to look up which racks you're renting.

3. Your data centre toolkit

Anyone who visits a data centre regularly has a standard kit of stuff to take with them. Some bits are obvious: cable ties and velcro loops, a variety of patch cords of different colours and lengths, power cords (with the right plugs and sockets), screwdrivers and a bag of rack nuts and bolts. Other bits are less obvious. The two most useful things I've ever owned were a pair of four-way adaptors. Both had four UK sockets; one had a US domestic plug and the other an IEC C14 (kettle-style) plug. Assuming you're in a UK data centre, the in-rack power strips will have either UK or IEC C13 sockets: you're always short on sockets in the power strips (hence the four-way for your laptop and phone charger for those long night-time upgrades), but your rack is always miles from the nearest wall socket, so you'll need to plug in within the rack.

Another thing you'll want to have with you is an up-to-date copy of your infrastructure documentation on your laptop or tablet: if you're in the data centre fixing a problem, there's a decent chance that the problem will prevent you from accessing the server with the documentation on. Finally, make sure you have all the right cables: some will live in your laptop bag (my USB-to-RS232 adaptor stays with me, for instance, as it needs a driver to work with Windows and I want to know for certain that my laptop has a working serial port). The rest need to live in the on-site stash – back to that later.

4. Rack design

Before you rock up to install the kit, design the rack layout. It doesn't have to be a Dali-esque masterpiece in Visio – in fact I use a spreadsheet for all of mine – but it has to be sensible. Start with the keyboard/monitor shelf – they've got to be located at a height where you can actually use them. Next come the heavy servers: try to put them near the bottom so you don't have to risk life and limb on a stepladder to install them (and if they're rail-mounted, you really don't want the balance issues of a 30-kilo monster sticking three feet out of the rack at a height of six feet). Next, think where you're going to put the switches, routers and firewalls – make sure you can see all the flashing lights you need to see and that they're placed such that you can run cables to them sensibly. And ensure that the kit's labelled and that the diagrams are annotated sensibly: it's a whole lot easier to do this before you install it all than to dig around in a rack afterwards.

Make sure the rack design includes every last detail, including (and especially) power and network cabling. Use the right length cables and plot the location of every device's network and power connection. If your in-rack power strips are IEC C13 format, then beware of any gadgets that are powered from UK-style transformer plugs – make sure the data centre provider gives you provision for this form factor with the sockets spaced such that you can actually fit the power supplies next to each other in the strips.
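
If you're keeping the layout in a spreadsheet anyway, it takes only a few more minutes to sanity-check it mechanically – no overlapping U positions, nothing heavy mounted high up. Here's an illustrative sketch (the device names, weights and thresholds are made up, not any standard):

```python
# Sanity-check a rack plan. Each device is (name, bottom_u, height_u, weight_kg).
# Flags devices outside the rack, overlapping U allocations, and heavy kit
# mounted high in the rack. All thresholds are illustrative.

def check_rack_plan(plan, rack_height_u=42, heavy_kg=25, max_heavy_u=15):
    problems = []
    occupied = {}  # U number -> device name occupying it
    for name, bottom, height, weight in plan:
        top = bottom + height - 1
        if bottom < 1 or top > rack_height_u:
            problems.append(f"{name}: outside the {rack_height_u}U rack")
        for u in range(bottom, top + 1):
            if u in occupied:
                problems.append(f"{name}: overlaps {occupied[u]} at U{u}")
            else:
                occupied[u] = name
        if weight >= heavy_kg and bottom > max_heavy_u:
            problems.append(f"{name}: {weight}kg mounted above U{max_heavy_u}")
    return problems

plan = [
    ("db-server", 2, 4, 30),    # heavy but near the bottom: fine
    ("app-server", 6, 2, 18),
    ("old-san", 20, 3, 40),     # heavy and high: flagged
    ("core-switch", 21, 1, 5),  # overlaps old-san at U21: flagged
]
for problem in check_rack_plan(plan):
    print(problem)
```

It's a toy, but catching an overlap or a 40-kilo unit planned for shoulder height on screen beats discovering it with the kit half-way into the rails.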

5. Cable management

Cable management: the most tedious thing in the world, and the thing you'll be thankful for if you do it properly. Don't just cram all the kit in the rack: make sure that you put cable management units in so you can run the cabling in a fashion that lets you work with the wiring later. And where rail-mounted servers have rear cable management arms, use them: strap the cables tightly to the arms and make sure they move without snagging. You need to be able to pull kit out without cables being stressed or catching on anything, and although it's a complete ball-ache and there's no such thing as a cable management arm that works like you want it to, they're all generally good enough as long as you're diligent.

Finally, label both ends of every cable – not right at the end by the plug but three or four inches from the end so you can see the labels easily. You don't have to spend hours doing funky professional labels: clear, handwritten labels on tags that won't fall off are better than funky printed ones that smudge and then fall off in a fortnight.

6. Move or replace?

When you're moving your stuff to the data centre, take the opportunity to consider replacing some of your kit. The average on-premise set-up has equipment of varying ages, and it's definitely worth looking into replacing at least some of the older stuff – for a couple of reasons. First is that if something's really old, you need to consider its chances of surviving the experience of being pulled out, carted across town and shoved in a new rack.

Second, even if you do think it'll survive the trauma of relocation, you need to decide whether you want to have both the downtime of the move and then the downtime of a system replacement soon afterwards. So if you're likely to replace some stuff anyway, think about bringing it forward if you can afford it.

7. Co-ordinating the move

Moving kit to the data centre is a combination of diligence and care. For the really crucial stuff, give strong consideration to employing professional movers – or at the very least use proper padded crates that are designed for IT kit.

Well before the move, test everything in the data centre. If you've been able (as per the last section) to replace some of the kit that's great, because you can install it ahead of time in the data centre and use it to verify power provision, internet connectivity and such like.

Before you unplug anything, label it. Don't unplug anything until you're absolutely convinced that you have whatever information you'll need to put it back together at the other end. The easiest call-out fee I ever earned was for a client who'd moved a server cluster to a new data centre and had managed to connect the SCSI cables wrongly on both the storage unit and the server: a two-hour drive, a 20-minute headscratch, a lightbulb moment, four re-plugs, a “rebuild your RAID config from the array” RAID card BIOS command that took half a minute, a two-hour drive home, a relieved client and a hefty bill.

8. Documentation

Document the set-up to death, down to every last detail – such as power and network connections – and store a copy of the docs in the racks as well as keeping them electronically: there's absolutely no harm having a copy taped somewhere convenient in the rack so long as it's not interrupting airflow. Make sure every single device in the rack is labelled clearly and correctly, and I've already mentioned cable labelling. Keep a change log book in the cabinet with a pencil firmly attached, and ensure anyone changing anything updates the book with the detail of what they've done and when. Documentation is your friend, but if you let it get even slightly out of date you're stuffed.

9. Remote monitoring

With your equipment located some way from the office, you'll feel like someone's cut off one of your limbs unless you compensate for your inability to simply wander into the server room and look at stuff. At the very least you need to run up some monitoring tools so you can keep a weather eye on the behaviour of the servers.

I don't mean things like SNMP monitoring of disk space and the like – you should be doing that already in your on-premise set-up, after all. Instead, I mean doing what you can to replace the visual stuff you used to use when the kit was local. Things like LEDs on the front panel, or those little LCD message panels that servers use to show error codes. Most manufacturers provide you with little tools that you can use to interrogate these remotely – but you may need to spend a few quid on the remote management/monitoring adaptors for some servers if you didn't originally buy them. We'll go deeper into remote management and monitoring in a separate piece, as there's a lot to it if you want to do it properly, but make sure you do enough to get a sense of comfort that you can see what's what from afar.
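
While you're sorting out the proper vendor tooling, even a crude poll of each box's management interface from the office gives you a first answer to "is it alive?". A minimal sketch, again stdlib-only – the names and addresses below are hypothetical stand-ins for your own management network:

```python
# Minimal remote "is it alive" poll: try a TCP connect to each management
# address/port and build a status report. This is a crude stand-in for real
# monitoring, not a replacement for it.
import socket

def poll(targets, timeout=2.0):
    """Return {name: 'up'|'down'} based on whether the TCP port answers."""
    status = {}
    for name, host, port in targets:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                status[name] = "up"
        except OSError:
            status[name] = "down"
    return status

targets = [
    ("db-server-mgmt", "10.0.50.11", 443),   # hypothetical management IPs
    ("app-server-mgmt", "10.0.50.12", 443),
]
# print(poll(targets))  # run from the office for a quick up/down view
```

A port answering doesn't prove the server is healthy, of course – that's what the manufacturer's remote management tools are for – but a port that's stopped answering at 3am is something you want to know about without driving over.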

10. Your stash of bits

I mentioned earlier that alongside your own toolkit that you carry around, you should have a stash of useful stuff in the data centre. Some providers let you have a plastic box in the bottom of your rack (assuming there's room) while others will rent you a small locker: whichever's the case, do it. Keep a comprehensive (but not vast) stock of power and network cables of all lengths and colours you might need, and if you can you should also have some spare hard disks and power supplies for the key equipment as these are the two things that blow up most frequently. Oh, and while we're talking about power, make sure you have the four-way power bars I mentioned in the stash, as well as having your own. Ensure you have the serial cable or any other custom configuration cable for every device in your installation, and a nice long cross-over Ethernet cable. Add some cable ties and more rack nuts and bolts and you're just about sorted.

What you've just read is just the start to working in data centres: although they look simple there's a lot to know if you want to do it properly. But take a few tips from these ten points and you'll have learned from quite a few mistakes I've made over the years, and your first data centre installation will be a whole lot easier than it otherwise might have been. ®

The Register - Independent news and views for the tech community. Part of Situation Publishing