  1. #1
    Join Date
    Mar 2003
    Location
    Chicago
    Posts
    285

    Mass move of servers between two data centers (at night)

    Has anyone pulled off a mass server move in the middle of the night? We have decided on 350 E Cermak but are still working out where in the building we want to be.

    While the final details are being worked out, I need to come up with a plan to pull the move off with the least amount of downtime. Our current DC is about 4 miles from 350 Cermak and we have approximately 50 servers to move. I plan on leaving a Linux box behind to redirect any traffic that is ignoring TTLs.
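    Something along these lines is what I have in mind for the leftover box. It is just a sketch (the address and port are made up), and iptables DNAT rules would be lighter weight, but it shows the idea:

    Code:
    #!/usr/bin/env python3
    """Tiny TCP relay for the box left at the old DC: clients that
    cached stale DNS hit the old IP and get forwarded to the new one.
    NEW_IP and PORT are placeholder values."""
    import socket
    import threading

    NEW_IP = "203.0.113.10"   # the server's new address (example value)
    PORT = 80                 # run one relay per service port

    def pump(src, dst):
        # copy bytes one way until either side hangs up
        try:
            while True:
                data = src.recv(4096)
                if not data:
                    break
                dst.sendall(data)
        except OSError:
            pass
        finally:
            src.close()
            dst.close()

    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("0.0.0.0", PORT))   # the old IP still answers here
    listener.listen(64)
    while True:
        client, _ = listener.accept()
        try:
            upstream = socket.create_connection((NEW_IP, PORT), timeout=10)
        except OSError:
            client.close()
            continue
        threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pump, args=(upstream, client), daemon=True).start()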

    I was thinking the best thing to do would be to label and remove all the hard drives, then see if a moving company could put the servers on pallets, wrap them, and move them to 350 E Cermak. Not sure whether a moving company would work in the middle of the night. Courier services are pretty much out; any time we have had a courier move anything that wasn't in a massive box with padding, it got bent or broken.

    Any suggestions?
    Scooby2

    Thanks to Karl for showing us around Steadfast on Friday! Karl and his crew run a great shop!

  2. #2
    Join Date
    May 2004
    Location
    Toronto, Canada
    Posts
    5,084
    If you are going to pull drives, make triple sure you label things really well. Spend a lot of time on planning, then either hire movers or, since it is really that short a distance, get your team together, rent a truck (or two), and move it yourself. You should be able to move 50 servers in a couple of trips with a truck.

    If I were you I would want to handle them myself with some people I trust: basically two to transport, one to de-rack at the old location, and one at the new location to bring things online as they arrive. Label them with tape or a labelmaker (IP/hostname/customer) and preplan the move machine by machine.
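    If you already keep your inventory in a spreadsheet, a few lines of Python will turn it into labels and a batch manifest. Just a sketch; the CSV name and columns here are invented, so adapt them to what you actually track:

    Code:
    #!/usr/bin/env python3
    """Print label text and a per-batch move manifest from an
    inventory CSV (hypothetical columns: hostname,ip,customer,rack)."""
    import csv

    BATCH_SIZE = 10   # move in batches of about 10

    with open("inventory.csv") as f:
        servers = list(csv.DictReader(f))

    for i, s in enumerate(servers):
        batch = i // BATCH_SIZE + 1
        print(f"batch {batch:02d} | {s['hostname']:<16} {s['ip']:<15} "
              f"{s['customer']:<20} -> rack {s['rack']}")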

    With 50, I don't think you need to remove hard drives; just pack and move in batches of 10 or so.
    André Allen | E: aallen(a)linovus.ca
    Linovus Holdings Inc
    Shared Hosting, Reseller Hosting, VPS, Dedicated Servers & Public Cloud | USA, Canada & UK - 24x7x365 Support

  3. #3
    Join Date
    Jan 2003
    Location
    Chicago, IL
    Posts
    6,889
    We did a mass move, a couple hundred servers, but that was just from the 5th floor of 350 E Cermak to the 2nd floor. We also didn't need to worry about IP renumbering or anything like that, as we were still using our own network, but I think I can offer some tips.

    I would not count on anyone else to deal with the equipment. Move it yourself, as you know you'll be careful with it, and then you can skip removing the hard drives, etc.

    If possible, I'd just load say a dozen in a car or SUV at a time, drive them down, and use a flat cart to move them up to where you're going; we can likely let you borrow one of our carts over at 350 E Cermak. Then you can make sure you get those 12 boxes up and running. While those are getting set up, you can have someone else taking down and preparing systems back at the initial location.

    Overall, this would ensure that customers have the least downtime possible, and I always find that the less you're worrying about at a time, the lower the risk of a problem. The only downside is you'd need two network setups, but if you have a redundant network configuration that shouldn't be a major issue.

    One note about 350 E Cermak: the loading dock will be closed at night, so you'll need to go through the North lobby, and since you can't take carts out the door there, you'd need to carry the servers in. That's another good reason not to take too many systems at a time. The freight elevator will also be closed, but just tell the security desk and they'll let you use it. Some forewarning would likely help, though, so they can plan for it and you won't need to wait too long.
    Karl Zimmerman - Steadfast: Managed Dedicated Servers and Premium Colocation
    karl @ steadfast.net - Sales/Support: 312-602-2689
    Cloud Hosting, Managed Dedicated Servers, Chicago Colocation, and New Jersey Colocation
    Now Open in New Jersey! - Contact us for New Jersey colocation or dedicated servers

  4. #4
    Join Date
    Jan 2004
    Location
    North Yorkshire, UK
    Posts
    4,163
    We frequently move corporate clients from DC to DC, and the first thing I'd say is don't try to do it all at once; do a rolling move of say 5-10 servers every night until they are all moved. Perhaps up your numbers once you've moved the first few and got used to the process.

    Removing the disks from the servers really isn't necessary; if you handle them yourself and do it carefully they will be fine. Just get a big roll of bubble wrap.

    Before you actually go pull servers from one location and install them at another:

    - Try to make sure you have rails already mounted in the rack, and your servers can be racked as soon as they get to the DC.
    - Make sure your switch configuration is done and your ethernet drops are there waiting to be connected.
    - Likewise for power: do your amperage calculations (a quick sanity check like the one below helps) and get cables ready for each server off the correct PDU, ready to plug in.
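    A back-of-the-envelope power check is only a few lines. The wattages, voltage, and breaker rating below are example figures, so plug in your own:

    Code:
    #!/usr/bin/env python3
    """Quick PDU/breaker sanity check. Loads a circuit to at most 80%
    of its rating (the usual continuous-load rule)."""

    VOLTS = 120          # or 208/230 depending on the feed
    BREAKER_AMPS = 20    # circuit rating
    DERATE = 0.80        # keep continuous load to 80% of the breaker

    servers = [("web-01", 200), ("web-02", 200), ("db-01", 350)]  # (name, watts) -- example data

    total_watts = sum(w for _, w in servers)
    amps = total_watts / VOLTS
    limit = BREAKER_AMPS * DERATE
    print(f"{total_watts} W = {amps:.1f} A at {VOLTS} V "
          f"(limit {limit:.1f} A on a {BREAKER_AMPS} A breaker)")
    print("OK" if amps <= limit else "overloaded -- split across circuits")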

    Your actual move should then be really fast: all you've got to do is unplug, move, plug back in, and presumably reconfigure your IPs. You can probably do 5 a night single-handedly and have the last one back up in under an hour.

    If you've got DNS to change, etc., you'll probably want someone else to do that (even offsite) while you focus on the physical moves and getting the boxes up again with new IPs.

    Dan
    Last edited by dkitchen; 01-06-2008 at 01:58 AM.

  5. #5
    scooby2,

    I did a massive server move for one of the largest ISPs in America, and I can only say planning is the key. We moved the servers whole, as most of them were Sun boxes, and the big advice I can give you is to give yourself extra time for the unexpected.

    Some other pointers I can give:
    -Move the stuff yourself, as Karl recommended; you care about your customers and a moving company won't
    -Label everything from wires to servers to hard drives to consoles to power, keep it organized
    -Have the new site labeled and racked
    -Add extra staff above and beyond what you think you may need
    -Plan, plan, plan every part of the move. Make sure you know which customers get moved when, and set deadlines for yourself

    I don't know how your current environment is set up, but I'm not a huge fan of hard drive swaps, as I've seen firmware issues with RAID arrays; it depends on your situation, though.

    Best of luck to you.

  6. #6
    Join Date
    Feb 2001
    Location
    West Michigan, USA
    Posts
    9,675
    We moved about 80 servers a couple of years ago from one building to the one next door. As others have said, organization and pre-planning are the key. Also, oversee everything yourself. You can tell the moving guys to be careful, but that's really just relative. Careful to you means "don't jostle or set these down hard"...careful to others might mean "don't drop them from higher than 4 ft".

    --Tina
    ||| 99.999% Uptime SLA!!!
    Plenty of space and bandwidth to fit your needs!
    www.AEIandYou.com - - (WP Friendly - Premium Reseller Hosting and Cheap Dedicated Servers)

  7. #7
    Join Date
    Oct 2007
    Location
    Deschutes, OR, USA
    Posts
    163
    I have done this many times, many different ways:

    1. I've moved ~200 servers from the east coast to the west (took 3 days total, ~36 hours downtime - did it over a holiday weekend)

    2. I've moved ~50 servers from the midwest to the west coast, no downtime. Did it all by moving data over the wire onto new servers, then shipping "empty" servers and using them to set up the next ones. A Rolling Upgrade of sorts. Took ~3 months.

    3. I've migrated an entire colocation facility with >1000 servers ~25 miles. The process took about two months of nightly moves of 10-50 servers at a time. LOTS of logistical work and client communications involved beforehand and during. Downtime averaged 1.5 hours per server, all at pre-arranged times.

    4. I've assisted with and performed many one-night "cut-and-runs" like you describe, both for my own machines and helping clients do theirs (in fact, I just did one last week).

    ...and I have lots of scars to show for it all. If you have specific questions feel free to contact me offline, I'm happy to share.

    By far the best way is NOT a one-night cut-and-run job. It concentrates all your pain into one night, but there are better ways. If you can manage temporary connectivity between the two places it is best to do it in a phased fashion. As you get better at it, you can move more servers at a time. By the time you get really good at it, you will be done.

    Things to remember:

    * EVERY time you move X number of servers, some percentage of them will experience a serious hardware failure and will not just come back online in their new home without some effort. Be prepared for this! (A quick reachability sweep like the sketch after this list helps you triage.)

    * Do NOT under-communicate with your clients/users. If you think you'll be able to sneak one by them, you are deluding yourself and risking your business. Tell them everything, tell them often, and keep them very well informed. Google "Alabanza Navisite Migration" for what awaits you if you don't take this VERY critical step.

    * As Karl pointed out, facility-related quirks will fsck up your plan. Do some test runs beforehand, AT THE HOURS you plan on doing the real thing. Figure out how to avoid pitfalls.
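    On the hardware-failure point: a dumb port sweep run right after each batch tells you immediately which boxes need hands-on attention. The hosts and ports below are examples only:

    Code:
    #!/usr/bin/env python3
    """Flag moved servers that didn't come back up by trying a TCP
    connect to each expected service."""
    import socket

    CHECKS = [                      # (host, port) -- example data
        ("203.0.113.10", 22),
        ("203.0.113.10", 80),
        ("203.0.113.11", 3306),
    ]

    for host, port in CHECKS:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(5)
        try:
            s.connect((host, port))
            print(f"{host}:{port} up")
        except OSError:
            print(f"{host}:{port} DOWN -- send someone to look at it")
        finally:
            s.close()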

    HTH.... & good luck.
    --chuck goolsbee, Prineville, Oregon, USA
    Please note: I no longer work for digital.forest in Seattle, WA, as I left them in early 2010 to pursue an amazing opportunity at an amazing datacenter project elsewhere... I do not speak for digital.forest here. However I still know they provide the best colocation in the Pacific Northwest.

  8. #8
    Join Date
    Jul 2001
    Location
    .INdiana
    Posts
    2,451
    I was reading an old thread about a server move, and the folks were surprised by the datacenter having a limit on the number of people allowed inside. You might check to make sure they can accommodate your group.
    Sneaky Little Hobbitses

  9. #9
    Join Date
    Jan 2003
    Location
    UK
    Posts
    131
    Some good points in all of the above. As it's relatively local, I'd suggest moving as few servers at a time as possible, in an order structured to produce as little downtime as possible. Pre-cable everything if possible, label everything, and make sure the DC is ready and all parties are aware of the plan. Safe transit is more important than fast transit, so take your time (ish).

  10. #10
    I agree with many of the points outlined above.

    There are transport firms that specialise in moving electronic systems. Yes, they will work at night. We even had extra trucks on standby in case of a vehicle failure.

    In addition to the points above:

    If you have to do network renumbering, it might be faster to put a *tested* script on each box that can be run *manually* once you fire up at the new location, to do all the changes. It is certainly less prone to fat-finger mistakes.
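    A sketch of what such a script might look like (the file paths and IP map are examples; test on a copy of the box first and keep the .bak files it writes):

    Code:
    #!/usr/bin/env python3
    """Swap old IPs for new ones across a fixed list of config files,
    keeping .bak copies. Run by hand after the box is racked."""
    import shutil

    IP_MAP = {                         # old -> new (example values)
        "192.0.2.10": "203.0.113.10",
        "192.0.2.11": "203.0.113.11",
    }

    FILES = [                          # typical suspects; adjust per box
        "/etc/network/interfaces",
        "/etc/hosts",
        "/etc/apache2/ports.conf",
    ]

    for path in FILES:
        try:
            with open(path) as f:
                text = f.read()
        except FileNotFoundError:
            continue                   # not every box has every file
        new_text = text
        for old, new in IP_MAP.items():
            new_text = new_text.replace(old, new)
        if new_text != text:
            shutil.copy2(path, path + ".bak")   # escape hatch
            with open(path, "w") as f:
                f.write(new_text)
            print(f"renumbered {path}")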

    Test the new network in advance.

    Do a Gantt chart schedule, with realistic timelines.

    Give everyone a schedule of their exact jobs and timings.

    Allow no deviation from the schedule without consultation.

    Ensure that all client DNS settings have TTLs suitable for quick propagation in place well in advance of the move. Then check it yourself.
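    The checking is easy to script too. A sketch using the dnspython package (the domain list is hypothetical; on dnspython releases before 2.0 the call is dns.resolver.query rather than resolve):

    Code:
    #!/usr/bin/env python3
    """Verify every client domain already has a low TTL before the
    move night. Requires the dnspython package."""
    import dns.resolver

    DOMAINS = ["example.com", "example.net"]   # your client zones here
    MAX_TTL = 300                              # five minutes

    for name in DOMAINS:
        try:
            answer = dns.resolver.resolve(name, "A")
            ttl = answer.rrset.ttl
            note = "" if ttl <= MAX_TTL else "  <-- still too high"
            print(f"{name}: TTL {ttl}{note}")
        except Exception as exc:
            print(f"{name}: lookup failed ({exc})")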

    BTW, it is possible to do it with next to zero downtime with the right architecture, but at a cost of doubling your bandwidth costs in the short run.
    edgedirector.com
    managed dns global failover and load balance (gslb)
    exactstate.com
    uptime report for webhostingtalk.com

  11. #11
    Join Date
    Mar 2004
    Posts
    550
    I personally did a move of ~30 servers between two datacenters 2 weeks ago. The datacenters were about 5 miles apart. Here's how I did it:

    • We outsource our DNS, so that stayed up during the move.
    • We were moving to newer switches, so the new ones were installed in the new location and working even while the old location was still active. This meant I could plug in a server at the new location and it was up right away.
    • I moved servers 10 at a time, and they were grouped based on purpose or importance. So if there were 5 servers that worked together as part of 1 application (database server, webserver, etc) those were moved as a group
    • Just before moving each server, I changed the IPs on it, shut it down, then went into our outsourced DNS and updated the IPs there.
    • Servers containing backups were moved separately from the servers they were backing up. This way, if the van crashed on the way, we'd not have full data loss.


    Using my method above, each server was generally down only as long as the DNS TTL (1 hour) as I could de-rack it, load it up, and have it installed in the new datacenter in that time. I found having the network up at both datacenters helped a lot. Even if you don't have spare switches, you may want to consider buying some basic switches off eBay to use just during the transition so you can be up at both places during the move.

    The thing that I ran into is that I was doing the move during the period 12/24-12/26. The old datacenter was closed during part of this time (due to the holidays). The new datacenter was open, but their loading dock was only open during business hours. I did manage to work with this, but you should check hours of opening for things like this. The new datacenter (Equinix) can keep the loading dock open at other hours, but they charge a remote hands fee ($150/hr) to do so.

    Oh, and be friendly to the security guards. They can do nice stuff like automatically open doors (instead of you doing PIN code + hand scan) when you have a server in your arms.

  12. #12
    Join Date
    Nov 2004
    Location
    Atlanta, GA
    Posts
    464
    Plan, plan, and plan some more!

    Some notes.

    - If possible in any way, renumber to your new IP blocks before exiting the old facility.
    - There are companies that do this, they are not cheap to use, but they do it and do it right.
    - Make sure that power is run correctly and check the breakers in the new datacenter. It is always nice to have an electrician on standby nearby. This is where I have seen most problems come from on moves.
    - Back up each server and do a full power cycle a few weeks before the move. This way you can check for systems that won't come back online after the hard drives spin down.
    - If you move the racks full, strap them down to the trailer with at least three times the strapping that you think you will need.
    - Spare screws, zip ties, power cables etc.
    - GOOD cordless drills that are fully charged.
    - If you can, move 5-10 servers at a time, not all of them at once.

    Cheers,

    Linn
    Linn Boyd

  13. #13
    Join Date
    Jan 2008
    Location
    Sweden
    Posts
    169
    Indeed, plan as much as you can. I moved my own rack a week ago; it was a 20-minute ride to the new DC. I started by hanging a pre-configured switch in the new DC to save some time, then changed the network settings on the servers and unmounted them one by one. I shipped 20 servers at a time, 40 in total. Pre-cabling can also save hours; make sure the cables are labelled for swift handling.
    SwedenDedicated Unmanaged & managed solutions.
    Dedicated Servers, VPS and colocation in Sweden - Stockholm.
    100Mbit Unmetered servers
    Follow us on twitter @swedendedicated

  14. #14
    Join Date
    Oct 2005
    Location
    Fleet Street
    Posts
    3,243
    On this note, does anyone have a link to a good, sturdy cart that can move a decent number of servers at a time? We just did a fairly large server move and it was rather difficult to do using dollies.

    Thanks.

  15. #15
    Join Date
    Jan 2003
    Location
    Chicago, IL
    Posts
    6,889
    Quote Originally Posted by avythe View Post
    On this note, does anyone have a link to a good, sturdy cart that can move a decent number of servers at a time? We just did a fairly large server move and it was rather difficult to do using dollies.

    Thanks.
    We get ours from uline: http://www.uline.com/Browse_Listing_1817.asp Make sure you don't get one that is too wide to fit through any of the doors you'll be going through. :-)
    Karl Zimmerman - Steadfast: Managed Dedicated Servers and Premium Colocation
    karl @ steadfast.net - Sales/Support: 312-602-2689
    Cloud Hosting, Managed Dedicated Servers, Chicago Colocation, and New Jersey Colocation
    Now Open in New Jersey! - Contact us for New Jersey colocation or dedicated servers

  16. #16
    Join Date
    Oct 2005
    Location
    Fleet Street
    Posts
    3,243
    We get ours from uline: http://www.uline.com/Browse_Listing_1817.asp Make sure you don't get one that is too wide to fit through any of the doors you'll be going through. :-)
    Those look good, but my concern would be with crossing the street. It looks like the servers would just fall off if you hit a bump (which is easy to do in downtown LA). Do you know of anything that has sides that would hold them in place?

  17. #17
    Join Date
    Jun 2001
    Location
    Denver, CO
    Posts
    3,301
    We have done a number of data center migrations over the years, including a 500-server move into our own data center last February; just before Christmas, we moved 3 cabs of gear for a customer. As others have mentioned, these things were key:

    1. Plan, plan, and plan. If you are moving to a dissimilar configuration (increasing number of racks, decreasing number of racks), know what's going where well in advance. If your servers are not currently labeled front and back, do so before the move. Also color code which server is going into which rack, so they are easier to find at the other end. Put stickers over any NIC ports that aren't in use so you don't accidentally insert your patch cable into the wrong interface.

    2. Do it yourself. Don't entrust the physical move job to an outside company. If you don't have enough physical hands, round up some technically knowledgeable customers / vendors / subcontractors. You will need manpower. The actual unracking / loading / unloading / reracking portion of the move is very physical, and the more hands you have the better. Assign two people to unrack gear, two people to load it up, and then have a fifth guy who can oversee everything, pitch in as necessary, and make sure things are going as planned. The fifth guy can also be the point person for updating customers on how the move is going.

    3. If you can manage, spread the move over many nights. We did our big move over 4 separate nights, so we were still moving over 100 servers a night. We had teams of 10-12 each night, however. The average customer had just over 4 hours of downtime.

    4. Automate your IP renumbering. Come up with a script that will automatically update everything (Apache, DNS, interface configurations, etc.). Run it before you take the servers down for the move, or put it on a USB drive and run it from the new data center.

    5. Have spares. Extra servers, extra drives, extra fans, etc. Stuff will break, there's no doubt about it.

    6. Come prepared. Bring your own tools (cordless drills, chargers, batteries, long bits, right-angled bits) and your own moving supplies (tape, foam, bubble wrap, tie-down straps). On the other end, if you have to install cage nuts, do so ahead of time.

    7. Communicate with your customers, well in advance. Let them know about the move: when it will be, how long it will take, about the renumbering, etc.

    8. If you are doing a mass move, make sure you keep your people well fed and energized.
    Jay Sudowski // Handy Networks LLC // Co-Founder & CTO
    AS30475 - Level(3), HE, Telia, XO and Cogent. Noction optimized network.
    Offering Dedicated Server and Colocation Hosting from our SSAE 16 SOC 2, Type 2 Certified Data Center.
    Current specials here. Check them out.

  18. #18
    Join Date
    Jan 2003
    Location
    Chicago, IL
    Posts
    6,889
    Quote Originally Posted by avythe View Post
    Those look good, but my concern would be with crossing the street. It looks like the servers would just fall off if you hit a bump (which is easy to do in downtown LA). Do you know of anything that has sides that would hold them in place?
    Nope, I've never needed to transfer servers down the street. I just pick them up at the loading dock, or similar. Get some weight on those carts, though, have someone specifically making sure servers don't slide off, and I doubt you'd have any issues if you stack them properly.
    Karl Zimmerman - Steadfast: Managed Dedicated Servers and Premium Colocation
    karl @ steadfast.net - Sales/Support: 312-602-2689
    Cloud Hosting, Managed Dedicated Servers, Chicago Colocation, and New Jersey Colocation
    Now Open in New Jersey! - Contact us for New Jersey colocation or dedicated servers

  19. #19
    Join Date
    Jun 2001
    Location
    Denver, CO
    Posts
    3,301
    Quote Originally Posted by avythe View Post
    Those look good, but my concern would be with crossing the street. It looks like the servers would just fall off if you hit a bump (which is easy to do in downtown LA). Do you know of anything that has sides that would hold them in place?
    Some automotive tiedown straps should help with that particular problem.
    Jay Sudowski // Handy Networks LLC // Co-Founder & CTO
    AS30475 - Level(3), HE, Telia, XO and Cogent. Noction optimized network.
    Offering Dedicated Server and Colocation Hosting from our SSAE 16 SOC 2, Type 2 Certified Data Center.
    Current specials here. Check them out.

  20. #20
    Join Date
    Jan 2003
    Location
    Chicago, IL
    Posts
    6,889
    Quote Originally Posted by Jay Suds View Post
    6. Come prepared. Bring your own tools (cordless drills, chargers, batteries, long bits, right-angled bits) and your own moving supplies (tape, foam, bubble wrap, tie-down straps). On the other end, if you have to install cage nuts, do so ahead of time.
    Yeah, I forgot about that one. Definitely make sure you have all the cage nuts installed in advance, and rails in place as well, if possible.
    Karl Zimmerman - Steadfast: Managed Dedicated Servers and Premium Colocation
    karl @ steadfast.net - Sales/Support: 312-602-2689
    Cloud Hosting, Managed Dedicated Servers, Chicago Colocation, and New Jersey Colocation
    Now Open in New Jersey! - Contact us for New Jersey colocation or dedicated servers

  21. #21
    Join Date
    Apr 2006
    Location
    Phoenix
    Posts
    808
    Get multiple KVMs, etc. Shutting down the servers takes a lot more time than people plan on. Assume 1 minute per server: with 50 servers, that's almost an hour of just shutting down servers.
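    Firing the shutdowns off in parallel gets most of that hour back. A sketch, assuming root SSH key access everywhere and a hypothetical hosts.txt with one machine per line (ssh may exit non-zero simply because the connection drops as the box halts, so verify stragglers on the KVM):

    Code:
    #!/usr/bin/env python3
    """Issue shutdowns over SSH in parallel instead of one at a time."""
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    def shut_down(host):
        rc = subprocess.call(["ssh", "-o", "ConnectTimeout=10",
                              f"root@{host}", "shutdown -h now"])
        return host, rc

    with open("hosts.txt") as f:
        hosts = [line.strip() for line in f if line.strip()]

    with ThreadPoolExecutor(max_workers=10) as pool:
        for host, rc in pool.map(shut_down, hosts):
            print(f"{host}: {'ok' if rc == 0 else 'verify on the KVM'}")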
    Jordan Jacobs | VP, Products|SingleHop| JJ @SingleHop.com
    Managed Dedicated Servers | Bare-Metal Servers | Cloud Services

  22. #22
    Join Date
    Mar 2003
    Location
    Chicago
    Posts
    285
    Thank you for all the tips. I did want to try and lay out the racks in something like Visio (any open source solutions?). This way I can attempt to map where cage nuts need to go and figure out what length Ethernet cables need to be made/bought. Everything will not go perfectly, but having a plan should help.
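    In the meantime, even a few lines of Python will print a plain-text elevation, one line per U, which is enough to mark where the cage nuts land. The inventory below is made up:

    Code:
    #!/usr/bin/env python3
    """Print a plain-text rack elevation, one line per U, from a simple
    inventory list. Names, positions, and heights are example data."""

    RACK_UNITS = 42

    inventory = [            # (name, top U position, height in U)
        ("switch-01",  42, 1),
        ("web-01",     40, 1),
        ("web-02",     39, 1),
        ("db-01",      36, 2),
        ("storage-01", 30, 4),
    ]

    slots = {u: "(empty)" for u in range(1, RACK_UNITS + 1)}
    for name, top, height in inventory:
        for u in range(top, top - height, -1):
            slots[u] = name

    for u in range(RACK_UNITS, 0, -1):   # top of rack first
        print(f"U{u:02d} | {slots[u]}")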

    Scooby2

  23. #23
    Join Date
    Jan 2003
    Location
    UK
    Posts
    131
    Quote Originally Posted by scooby2 View Post
    Thank you for all the tips. I did want to try and lay out the racks in something like Visio (any open source solutions?).

    Scooby2
    Pen & paper? Or, as we did recently, Excel: 1U per row, and it makes moving things around (and recalculating the nuts/cables required when you do make a change) easier.

  24. #24
    Join Date
    Feb 2004
    Posts
    962
    Quote Originally Posted by scooby2 View Post
    Thank you for all the tips. I did want to try and lay out the racks in something like Visio (any open source solutions?). This way I can attempt to map where cage nuts need to go and figure out what length Ethernet cables need to be made/bought. Everything will not go perfectly, but having a plan should help.

    Scooby2
    Get a ruler, a pencil (pencil, not pen), a few pieces of grid paper, and an X-Acto knife. Make a scale of one square = one floor tile, then cut out shapes of your racks to scale. Draw the shape of your cage space on another piece of grid paper, and push the cut-out shapes around on that drawing. Best tools available for this type of layout job.

  25. #25
    Join Date
    Nov 2005
    Location
    Portland, Oregon
    Posts
    1,080
    We did a move a while back, but not as extensive as yours: we simply moved 2 cabinets of servers into a cage. We ended up with about 45 minutes of downtime with 4 people moving servers to the other side of the datacenter. Just remember to plan and label everything. It was much easier for us to put the servers up in the racks according to how we labeled them.

    I wish you the best of luck with the move.
    » VPSFuze.com - Performance should be noticeable - VPS Hosting at its best.
    » HostingFuze.com - Affordable & Reliable Shared & Master Reseller hosting services
