Newbie Question GPU/Watts

This may seem like a really stupid question to many of you; I've tried reading about it but still can't seem to grasp the concept. If I have one GPU that needs at least 450 Watts to run correctly, and I want to add another in SLI, does that mean I need a power supply of at least 900 Watts? I've seen people run tandem SLI with like a 600-750 Watt PSU. Is a 450W PSU generating a constant 450 Watts? Could someone explain the relationship between the power supplied by a PSU and the wattage needed by system components?

The power requirements for the entire computer are included in the PSU recommendation. Consequently, you only need to consider the power requirements of the additional video card when determining how much a second card would require you to upgrade the PSU. The lower-end cards such as the 8400GS or 8500GT add maybe 150 to 175 Watts to the recommendation. An educated guess is that a second 8800GT would require you to go from 450 Watts (one card) to 650-750 Watts.
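As a rough sketch of that estimate (the wattage figures below are the guesses from the post above, and the headroom percentage is an illustrative assumption, not a measured value):

```python
# Rough PSU sizing sketch. Numbers are the ballpark guesses from the
# discussion above (a second 8800GT-class card adds ~150-175 W), not
# measurements of any specific system.
base_recommendation = 450  # W, vendor recommendation with one card
second_card_draw = 175     # W, assumed extra draw for a second 8800GT
headroom_factor = 1.15     # assumed ~15% headroom so the PSU isn't run at its limit

required = (base_recommendation + second_card_draw) * headroom_factor
print(round(required))  # 719 -- lands in the 650-750 W range suggested above
```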

It is not only the wattage rating of the supply that is important; more important is how much of that power is actually delivered on the +12V rail, which is where most modern components draw their power from. This is also influenced by the conditions under which the testing was performed, and whether the current rating on each rail is a peak rating (the maximum that can be drawn from the rail before it shuts down, usually sustainable for less than a couple of minutes) or a continuous one. Thus, a crappy unit like a Raidmax 600W may have a total +12V current rating of 40A, but this is usually a peak rating taken at 25C (most PSUs have an average operating temperature of around 40-50C). Under real-world conditions, the actual capacity is considerably lower.
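The amps-to-watts relationship above is just Ohm's-law arithmetic: watts on the rail = volts x amps. A quick sketch using the 40A example (the 25% derating figure is an illustrative assumption for a hot, real-world environment, not a measurement of any particular unit):

```python
# Convert a +12V rail's label current into watts, then apply a rough
# derating for a peak rating taken at 25C. The 25% derating is an
# assumed illustration, not a tested figure.
rail_voltage = 12.0   # V
rated_current = 40.0  # A, label rating of the Raidmax 600W example above

label_watts = rail_voltage * rated_current       # 480 W on paper
derating = 0.25                                  # assumed loss at realistic 40-50C temps
realistic_watts = label_watts * (1 - derating)   # ~360 W of usable continuous power

print(label_watts, realistic_watts)  # 480.0 360.0
```

So a "600W" unit may only have about 360W of usable +12V capacity once heat and peak-versus-continuous ratings are accounted for.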

Ultimately, it is the quality of the PSU that matters, not just how many watts it claims to supply. A fine example is the Seasonic-made Corsair 450VX, which can easily put many 500W+ units to shame.

Macsimus said:

Could someone explain the relationship between power generated by a PS and the Wattage needed by system components?

The PSU only provides as much power as the components need, nothing more. If the draw from the components exceeds the PSU's output limit, it will either shut down or blow; the latter can cause other components to fail due to a current surge.

Thank you for the responses, they're of great help! Right now I'm running a Rosewill RP550-2 550W ATX12V v2.01 power supply with an 8800GT. Since prices on the 8800 have come way down, I was debating getting another one and running in SLI, even though the performance increase is only around 50 percent. I was trying to calculate what kind of PSU I'd need. So I guess what I'm looking for is a 650-750 Watt unit with a strong +12V rail. Is it good when it says +12V2, and what about over-voltage protection?

All three of these units are active PFC and 80% + efficiency, so they won't be too hard on the electric bill.

You can go to this site, http://www.tweaktown.com/ , click on the visual tab (upper right), and many video card articles are available. Power consumption numbers are toward the end of each article, usually on the next-to-last page.

Forget the Rosewill; it probably won't handle the extra load well. Also, you won't need a 700W PSU with a lot of amps on the +12V rails. The Corsair 550VX will do fine, with a single +12V rail providing 41A. Get it cheap here.

However, if you want a PSU that will support an upgrade to a pair of faster cards in SLI, get this PSU instead. It will be good for any configuration up to two GTX 260s.