  1. #1
    Join Date
    Dec 2004
    Location
    San Francisco, CA
    Posts
    1,912

Does 240 volts generate less heat?

Yes, I know that 240 volts should technically generate less heat than 120 volts, according to physics and how electricity works (less current passing through the same resistance means less heat generated).

However, I was talking to someone who pointed out that power is stepped down on the motherboard, so whether the server is running at 120 or 240 shouldn't really matter. Still, the PSU has to do that stepping down: does it generate more heat stepping down from 110 than from 240, or is it the other way around?

What is your take on servers running on 240 volts and the amount of heat they generate?

  2. #2
    Join Date
    Dec 2004
    Location
    San Francisco, CA
    Posts
    1,912
Sorry, I meant 110 V and 220 V, not 120 and 240.

  3. #3
    Join Date
    Jun 2002
    Location
    PA, USA
    Posts
    5,143
Volts and heat are two different quantities. Heat dissipation is a rate of energy transfer, measured in watts; a volt is energy per unit charge.

    And no, higher voltage does not imply lower heat, and vice versa.

    P = V I


Quote Originally Posted by Yash-JH
What is your take on servers running on 240 volts and the amount of heat they generate?
Makes no difference. Consider your server power supply, rated at, say, 220 watts. If it is used on a 220-volt circuit, the current drawn by the power supply will be at most 1 amp. On the other hand, if you plug the power supply into a 110 V circuit, the maximum current drawn will be 2 amps. In both cases, the maximum heat generated (assuming all of it is turned into heat) will be 220 watts, regardless of the input voltage.
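If it helps to see the numbers, here is a small Python sketch of that arithmetic; the 220 W figure is just the example rating above.
Code:
# A supply delivering a fixed wattage draws less current at a higher input
# voltage, but the power (and so the worst-case heat) is the same either way.

LOAD_WATTS = 220.0  # same example rating as in the post

for volts in (110.0, 220.0):
    amps = LOAD_WATTS / volts  # I = P / V
    print(f"{volts:5.0f} V input -> {amps:.1f} A drawn, {volts * amps:.0f} W dissipated at most")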
    Fluid Hosting, LLC - Enterprise Cloud Infrastructure: Cloud Shared and Reseller, Cloud VPS, and Cloud Hybrid Server

  4. #4
    Join Date
    Jun 2002
    Location
    Waco, TX
    Posts
    5,623
Actually, 220 has a better sine wave, which provides cleaner power and causes less heat in some cases; the question is whether it does so in a server power supply. Server-class PSUs are pretty efficient but still not 100%, and I believe it should run a bit cooler. Even APC says so:
    http://www.apcmedia.com/salestools/S...NQZ7_R1_EN.pdf

I have seen it in practice as well: when the sine wave is cleaner, the components have to work less, which makes the supply more efficient and translates to less heat. While watts are the direct relation, the way the power comes in is also important, and 110 is simply inefficient power.
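For what it's worth, here is a hedged sketch of how a small efficiency difference would show up as heat; the 85%/87% figures and the 300 W load are assumptions for illustration, not numbers from the APC paper.
Code:
# Assumed efficiencies, NOT taken from the APC paper: many switch-mode supplies
# are quoted a point or two more efficient on 230 V input than on 115 V.
# This just shows how small the resulting heat difference in the PSU is.

DC_LOAD_WATTS = 300.0                # power actually delivered to the server
EFFICIENCY = {115: 0.85, 230: 0.87}  # assumed efficiency at each input voltage

for volts, eff in EFFICIENCY.items():
    input_watts = DC_LOAD_WATTS / eff
    heat_watts = input_watts - DC_LOAD_WATTS  # loss inside the PSU becomes heat
    print(f"{volts} V input: draws {input_watts:.0f} W, wastes {heat_watts:.0f} W as heat")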

  5. #5
    Join Date
    Dec 2004
    Location
    San Francisco, CA
    Posts
    1,912
    Quote Originally Posted by FHDave
Makes no difference. Consider your server power supply, rated at, say, 220 watts. If it is used on a 220-volt circuit, the current drawn by the power supply will be at most 1 amp. On the other hand, if you plug the power supply into a 110 V circuit, the maximum current drawn will be 2 amps. In both cases, the maximum heat generated (assuming all of it is turned into heat) will be 220 watts, regardless of the input voltage.
Heat = I² × R × t
Less current, less heat.

  6. #6
    Join Date
    Nov 2002
    Posts
    2,780
I know that in power transmission, the lower the current, the fewer electrons move and thus the less heat. That's why they step the voltage up to 750 kV or so for transmission. Not sure if this would apply to servers.
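To put rough numbers on the transmission point (the line resistance and power figures here are made up for illustration):
Code:
# For a fixed power sent down a line of fixed resistance, the I^2 * R loss in
# the line falls with the square of the transmission voltage.

POWER_WATTS = 100e6  # 100 MW delivered (assumed)
LINE_OHMS = 10.0     # resistance of the line (assumed)

for kilovolts in (110, 750):
    amps = POWER_WATTS / (kilovolts * 1000.0)
    loss_watts = amps ** 2 * LINE_OHMS
    print(f"{kilovolts} kV: {amps:.0f} A, {loss_watts / 1e6:.2f} MW lost heating the line")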
    http://Ethr.net jay@ethr.net
    West Coast AT&T / Level3 / Savvis Bandwidth, Colocation, Dedicated Server, Managed IP Service, Hardware Load Balancing Service, Transport Service, 365 Main St, SFO / 200 Paul Ave, SFO / PAIX, PAO / Market Post Tower, 55 S. Market, SJC / 11 Great Oaks, Equinix, SJC

  7. #7
    Join Date
    Sep 2005
    Location
    Airdrie, Alberta, Canada
    Posts
    197
The short answer is that there is less heat overall, but it's negligible. If the load is constant, a higher voltage will require less current to do the same work. Where's the benefit of 240? Lower power bills. As for power transmission, voltages are bumped up to those levels to keep the current down. Ever wondered why power transmission cables are smaller in diameter than most residential feeders? Lower current = smaller wire size = $$ savings. It also allows for oversizing of the conductors to compensate for line loss.
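A quick illustration of the wire-sizing point; the load and feeder resistance below are assumed example values.
Code:
# The same load at double the voltage draws half the current, so both the
# voltage drop and the I^2 * R heating in a feeder run are reduced.

LOAD_WATTS = 2400.0
FEEDER_OHMS = 0.05  # assumed round-trip resistance of the feeder run

for volts in (120.0, 240.0):
    amps = LOAD_WATTS / volts
    drop_volts = amps * FEEDER_OHMS        # V = I * R across the cable
    cable_watts = amps ** 2 * FEEDER_OHMS  # heat dissipated in the cable
    print(f"{volts:.0f} V: {amps:.0f} A, {drop_volts:.1f} V drop, {cable_watts:.0f} W heating the cable")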
    Dan Bulmer
    CRUSE Hosting Services
    http://www.crusehosting.com
    Full H-SPhere Clustered Servers

  8. #8
    Join Date
    Apr 2005
    Location
    Jacksonville, FL
    Posts
    981
    Everyone seems to have taken a position on this one way or the other; not me. The technical explanation is that whether or not less heat is generated is totally dependent on your load, end of story.

What we're talking about is power supply efficiency. Your power supply has resistive and reactive current draw; resistance = heat, reactance = storing/releasing energy. Reactive loads don't generate any heat. So in order for your power supply to generate less heat, it simply has to become less resistive. Resistance affects current linearly; measure your input current and multiply it by your input voltage in each scenario and you have your heat answer (in watts).
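One way to turn those meter readings into a heat number; the readings, power factor, and DC output below are assumed example values, and the PSU's own heat is its real input power minus what it delivers.
Code:
# Heat generated inside the PSU = real input power - DC power delivered.
# On AC, V * I is apparent power, so a power factor is included here.

def psu_heat(input_volts, input_amps, power_factor, dc_output_watts):
    real_input_watts = input_volts * input_amps * power_factor
    return real_input_watts - dc_output_watts  # whatever isn't delivered becomes heat

# hypothetical meter readings for the same server on two circuits
print(f"{psu_heat(120, 3.00, 0.98, 300):.1f} W lost at 120 V")  # ~52.8 W
print(f"{psu_heat(240, 1.45, 0.98, 300):.1f} W lost at 240 V")  # ~41.0 W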

    The reason I'm not going to state an opinion one way or the other is because I know how intricate PC power supplies have become. In some cases you simply won't find a transformer inside; everything is controlled by integrated circuits. Although one might rightfully state that the analog components in power supplies are going to behave the same regardless of input voltage, there are still too many variables in the equation to even hypothesize.

If you do any testing, please report back and let us know what you find.

    Now selling BigVPS's!
    Jacksonville Colocation and dedicated servers by colo4jax
    We are *not* a reseller. We own our servers, switches, routers and racks.

  9. #9
    Join Date
    Jun 2002
    Location
    PA, USA
    Posts
    5,143
    Quote Originally Posted by Yash-JH
Heat = I² × R × t
Less current, less heat.
    OK ... so ...

    I = V/R

Total heat = H = (V² / R) × t

So are you now saying that higher voltage generates more heat?

I and V are related. What really matters is P (the product V × I). The CPU, for example, may require 95 watts of power, and you can supply that power at whatever voltage you like; the bigger the voltage, the smaller the current. The end product is constant: 95 watts.
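A worked version of that point, treating the CPU as a constant-power load rather than a fixed resistor:
Code:
# Model the load as constant power. The effective resistance it presents then
# changes with voltage, and I^2*R and V^2/R both come back to the same 95 W.

P = 95.0  # watts the CPU needs, independent of the supply voltage

for V in (1.2, 12.0, 120.0):
    I = P / V  # current adjusts to deliver the same power
    R = V / I  # effective resistance the load presents at that voltage
    print(f"V={V:6.1f} V  I={I:7.3f} A  I^2*R={I * I * R:5.1f} W  V^2/R={V * V / R:5.1f} W")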
    Fluid Hosting, LLC - Enterprise Cloud Infrastructure: Cloud Shared and Reseller, Cloud VPS, and Cloud Hybrid Server

  10. #10
    Join Date
    Jun 2002
    Location
    PA, USA
    Posts
    5,143
Also, if the input voltage mattered, you should start to wonder why it matters. Regardless of the input voltage to the power supply, the power supply will convert it to low-voltage DC (12 V). That is the operating voltage for your server, and that's what matters. So now you wonder: does lower voltage actually mean higher heat generation? If so, perhaps the PC manufacturers should start designing servers/PCs that run directly on 110/220 volts. That's not gonna happen.
    Fluid Hosting, LLC - Enterprise Cloud Infrastructure: Cloud Shared and Reseller, Cloud VPS, and Cloud Hybrid Server

  11. #11
    Join Date
    Jun 2002
    Location
    PA, USA
    Posts
    5,143
    Quote Originally Posted by (Stephen)
Actually, 220 has a better sine wave, which provides cleaner power and causes less heat in some cases; the question is whether it does so in a server power supply.
OK, the APC link does not mention this.

Quote Originally Posted by (Stephen)
Server-class PSUs are pretty efficient but still not 100%, and I believe it should run a bit cooler. Even APC says so:
http://www.apcmedia.com/salestools/S...NQZ7_R1_EN.pdf

That doc mentions that higher voltage will make the wiring devices, fusing, and switches run cooler (e.g., the APC wiring, the power cables, etc.). It makes no mention of the appliances themselves, which is what really matters. The heat generated by an appliance is constant; it depends on how many watts the appliance needs, the maximum being set by its power supply. However, the wiring devices, fuses, and switches (the circuit breakers, the fuses, and the power cables) -- all the way from the power grid to your rack, from your rack to your PDU, from your PDU to your servers -- do indeed run cooler. Compared to the servers in your rack as a whole, though, this is negligible.

As to why the wiring devices run cooler: the resistances of these wiring devices (and fuses, switches, breakers, etc.) are constant, so the higher the voltage you pass through the wire, the lower the current. And now you can see the real advantage of using high voltage for power transmission, since P = I² × R; this is one of the main reasons high voltage is used for transmission.
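Rough numbers for the "negligible" part; the cord resistance and server wattage are assumed example values.
Code:
# The I^2 * R saving in the cabling is tiny next to the heat the server
# itself dissipates, which stays the same at either voltage.

SERVER_WATTS = 220.0
CORD_OHMS = 0.06  # assumed round-trip resistance of a power cord

for volts in (120.0, 240.0):
    amps = SERVER_WATTS / volts
    cord_watts = amps ** 2 * CORD_OHMS
    print(f"{volts:.0f} V: cord dissipates {cord_watts:.2f} W "
          f"vs {SERVER_WATTS:.0f} W in the server itself")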

But again, your appliance does not really care about the input voltage. Besides, regardless of the input voltage, every component in your server except the power supply actually runs on low DC voltages (12 volts and below). If higher voltage meant lower heat, why would people, after decades, still design PC components to use low voltage?

Quote Originally Posted by (Stephen)
I have seen it in practice as well: when the sine wave is cleaner, the components have to work less, which makes the supply more efficient and translates to less heat. While watts are the direct relation, the way the power comes in is also important, and 110 is simply inefficient power.
I have worked with all sorts of voltages, from low DC voltages (microvolts) to moderately high voltages (kV, DC or AC), all with different sorts of currents. I have never seen the quality of the AC sine wave depend on the voltage level. At the end of the day, two equally good power supplies will produce two equally good AC/DC signals. And you can do a quick calculation: a small impurity in your sine wave will not matter much in terms of the power delivered. In fact, there is a square wave that will deliver the same (or less) power as a pure sine wave, despite the fact that a square wave looks so different from a pure sine wave.

BTW, I don't see the relevance of the quality of the AC sine wave to your server. Your server should have a good power supply, and the power supply will convert AC to DC. Furthermore, a good power supply will produce good-quality DC voltage (perhaps with less than 0.1% ripple). That DC voltage is what the server's components use, and it is those components that produce the heat.
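A quick numerical check of the square-wave remark: into a plain resistor, only the RMS value of the waveform matters for power, not its shape. The peak voltage and load resistance below are just example figures.
Code:
import math

SINE_PEAK = 170.0                    # roughly a 120 V RMS mains sine, for example
sine_rms = SINE_PEAK / math.sqrt(2)  # RMS of a sine is peak / sqrt(2)
square_rms = sine_rms                # a square wave's RMS equals its amplitude,
                                     # so pick its amplitude equal to the sine's RMS

R = 10.0  # ohms, an arbitrary resistive load
print(f"sine wave:   {sine_rms ** 2 / R:.0f} W into {R} ohms")
print(f"square wave: {square_rms ** 2 / R:.0f} W into {R} ohms")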
    Fluid Hosting, LLC - Enterprise Cloud Infrastructure: Cloud Shared and Reseller, Cloud VPS, and Cloud Hybrid Server

  12. #12
    Join Date
    Dec 2001
    Location
    Toronto, Ontario, Canada
    Posts
    6,896
I'm surprised that nobody has mentioned that on three-phase power (a.k.a. any decent/sizable UPS), you won't be able to get 220/240 V but instead 208 V, due to the way the sine waves intersect: each phase is offset by 120 degrees rather than the 180-degree offset of split-phase power, so the voltage between two 120 V legs works out to 120 × √3 ≈ 208 V instead of the potential 240 V (easier to understand if you draw it out).
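The arithmetic behind the 208 V figure, for anyone who wants to check it:
Code:
import math

leg_volts = 120.0
# two legs 120 degrees apart (three-phase) vs 180 degrees apart (split-phase)
print(f"three-phase line-to-line: {leg_volts * math.sqrt(3):.0f} V")  # ~208 V
print(f"split-phase line-to-line: {leg_volts * 2:.0f} V")             # 240 V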

Bottom line is, voltage is irrelevant when it comes to the amount of heat for a given number of watts consumed (100 W of heat is the same at any voltage); you're just asking about PSU efficiency at this point, which is another matter on its own (just because the PSU is more efficient doesn't mean you're not getting trade-offs elsewhere, like the UPS unit providing the conditioned power, other connected gear with different PSUs/different efficiencies, etc.).
    Myles Loosley-Millman - admin@prioritycolo.com
    Priority Colo Inc. - Affordable Colocation & Dedicated Servers.
    Two Canadian facilities serving Toronto & Markham, Ontario
    http://www.prioritycolo.com

  13. #13
    Join Date
    Aug 2006
    Posts
    75
We will be testing a new (for me at least) DC power system for Dell and IBM gear starting Jan 15. I've wanted to build one of these for the entire 11 years I've been in the data center management business, so I'm very excited.

Initial calculations show that we have a lower cost to acquire the power (I'm talking about costs apart from the utility here - UPS, etc.), lower heat, and lower demand per server unit.

    The system looks like this:
    Code:
Utility-----ATS1-----Existing AC UPS-----Existing AC Load
              |
Generator 1 -\|
              Bus Sync
Generator 2 -/|
              |
Utility-----ATS3-----Rectifier-----Battery Bank-----Bus Duct-----Load
Each row has a bus duct running above it. Whips are attached directly to the duct for distribution power. Each whip has its own single-pole breaker. Power then goes to a breakout panel (rack-mount 4U, custom fabricated) and then to the 28 x 1U and 2 x 2U servers in each rack.

We have calculated that with the decreased demand per server, the decreased BTU/hr from the lack of a static switch and inverter in the UPS, the decreased CFM required per server because each server no longer carries its own power supply, and the projected few BTU/hr saved per server from that missing PSU, we will see an 18% overall reduction in demand for an identical AC load.
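As a very rough sanity check of numbers in that ballpark, here is a sketch with assumed stage efficiencies; none of these figures come from our measurements, Dell, or IBM, and it ignores the cooling (CFM) savings entirely.
Code:
# AC path: utility -> double-conversion UPS -> per-server AC PSU.
# DC path: utility -> rectifier/battery plant -> on-board DC conversion.
# All efficiencies below are assumptions for illustration only.

def chain_efficiency(stages):
    eff = 1.0
    for stage in stages:
        eff *= stage
    return eff

ac_chain = chain_efficiency([0.88, 0.82])  # assumed UPS and AC PSU efficiencies
dc_chain = chain_efficiency([0.95, 0.92])  # assumed rectifier and DC conversion efficiencies

it_load_watts = 100_000.0  # 100 kW of actual IT load
ac_demand = it_load_watts / ac_chain
dc_demand = it_load_watts / dc_chain
print(f"AC chain draws {ac_demand / 1000:.0f} kW, DC chain draws {dc_demand / 1000:.0f} kW")
print(f"reduction: {(1 - dc_demand / ac_demand) * 100:.0f}%")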

Lorraine and Marconi computed a 21-22% savings; however, I still have my doubts. I think there will be a savings on the MRC for power, not to mention the dramatically lower maintenance, installation, and capex, but I don't think the numbers will be that dramatic.

    Obviously the OP was asking about 240v AC, so this is a little OT.
