I have a 2U server with redundant 700-watt power supplies. I'll be moving it into colocation shortly, but I'm confused about how much power I need. Would it be about 6 Amps or 12 Amps? Should I add 30% overhead for safety?
Are you sure? It seems like the power supply rating has to mean something, otherwise why would it exist? I've seen some companies such as HP use component-based calculators to estimate power. I also found another post that says "Calculation should be done for one Power Supply with maximum load". Is that right?
The rating simply states the maximum capability of the power supply, not what the server actually draws. Leaving ~20% headroom, that means you could power components that use up to about 550W, or roughly 4.6A at 120V. In practice, most servers draw only about 1-2A.
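The arithmetic above can be sketched as follows. A minimal back-of-envelope calculation, assuming a 700W nameplate rating, a 120V circuit, and ~20% headroom (the thread's ~550W figure is just a slightly different rounding of the same math):

```python
# Back-of-envelope PSU headroom math (assumed values, not vendor specs).
PSU_RATING_W = 700   # nameplate rating of one supply
VOLTAGE = 120        # typical North American colo circuit
HEADROOM = 0.20      # leave ~20% of the rating unused

usable_watts = PSU_RATING_W * (1 - HEADROOM)  # watts you'd plan around
usable_amps = usable_watts / VOLTAGE          # same budget expressed in amps

print(f"usable load: {usable_watts:.0f} W ~= {usable_amps:.1f} A at {VOLTAGE} V")
```

The key point is that this is a ceiling, not the actual draw; the real draw of the components is what the colo provider bills against.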
That makes sense. I spoke with a friend who runs a data center, and he pretty much said to tell them it draws 2 amps, maybe 3 on boot. He said he charges $12 an amp. Anyway, thanks for your help; I would have lost a lot of money if I hadn't asked.
That server is really old... it will probably cost you more to run every month than it cost to buy.
That's indeed going to draw about 2.5 Amps or more.
You would get much better (10x+) performance out of a new $600-$800 server that draws under 1A, and your colo bill would be lower.
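To make that comparison concrete, here is a rough monthly-cost sketch at the $12/amp rate mentioned earlier in the thread. The draw figures (2.5A for the old box, 1A for a replacement) are the estimates from this discussion, not measured values:

```python
# Rough colo cost comparison; all figures are estimates from the thread.
RATE_PER_AMP = 12.0     # USD per amp per month, as quoted above

old_server_amps = 2.5   # estimated draw of the older 2U server
new_server_amps = 1.0   # estimated draw of a modern replacement

old_monthly = old_server_amps * RATE_PER_AMP
new_monthly = new_server_amps * RATE_PER_AMP
yearly_savings = (old_monthly - new_monthly) * 12

print(f"old: ${old_monthly:.0f}/mo, new: ${new_monthly:.0f}/mo, "
      f"saves ${yearly_savings:.0f}/yr")
```

At these assumed numbers the power savings alone recover a few hundred dollars a year, before counting the performance difference.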