
Section II:
Power supply capacity calculation. A generally accepted efficiency figure for a typical computer power supply unit is 70%. At 95W power consumption at the plug, I am getting 7/10 of the input power at the output, which works out to 66.5W. It looks like this computer is loading this particular 230W PSU to only about 29%, leaving me with 163.5W of unused capacity. I can see how pre-built computers with 140W and 90W power supplies run fine. If this computer were built with a 140W PSU, you would still have 73.5W of power available. 73.5W is enough for several HDDs, a CD-ROM drive, and a graphics accelerator.
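That arithmetic can be sketched as a few lines of Python (the 70% efficiency is the same ball-park figure used above; nothing here is measured):

```python
# Rough PSU loading estimate from wall draw, assuming ~70% efficiency.
def psu_output_watts(plug_watts, efficiency=0.70):
    """DC power actually delivered by the PSU, given the draw at the plug."""
    return plug_watts * efficiency

def psu_headroom(rated_watts, plug_watts, efficiency=0.70):
    """Unused capacity on the PSU's DC side."""
    return rated_watts - psu_output_watts(plug_watts, efficiency)

out = psu_output_watts(95)        # 66.5 W delivered
load_pct = out / 230 * 100        # ~29% load on a 230 W supply
spare = psu_headroom(230, 95)     # 163.5 W of headroom
print(f"{out:.1f} W out, {load_pct:.0f}% load, {spare:.1f} W spare")
```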

My Athlon machine only uses 90W at idle. I don't see why a 180W power supply couldn't be used on it. People always say you need at least 300W and preferably 350W. I don't know what this hype is all about.

My web machine~Econobox 60W~
Power consumption at the plug while web browsing: 60W
It's very quiet.

A power supply generates 12V, 5V, and 3.3V at different amperages. The power supply is rated on all the wattages added together. Athlons require a power supply rated for 160W between the 5V and 12V rails (I think) to run stable. Most junk power supplies are deficient on one of these, usually the 5V. You've got lots of power for hard drives and such, but the system still runs like crap.

Extra is always better. The higher the rating on the power supply, the less you will strain it. I ran a generic 250W power supply on my Duron 600@900MHz (1.725V) 100% stable, but it burned out the power supply after about 3-4 months. It likely overheated, and a better fan would have kept it alive longer, but it came in a case that I only paid $25 for.

Originally posted by marv117: wait... so is IT or is IT NOT necessary to have 300 watts of power???

cuz running a 300watt+ psu all day long does a job on our bill.

No... when you have a PSU rated for 600 watts, it's not always consuming 600 watts. It only consumes what your computer needs it to supply... usually around 200-250W, and sometimes much lower with low-power or low-speed equipment. So having a 300W PSU does not mean it's pulling 300W and putting 300W worth of cash on your electric bill. It just means that it CAN pull that much power IF NEEDED by the computer... hope this was clear enough to understand... PM me if it wasn't


The wattage of a power supply usually tells you the maximum total power you can draw from it, but sometimes it means the peak wattage.

Total wattage is less important than the so-called "combined power".

Even though it doesn't quite make sense as English, "combined power" is the term used to describe the power of the 3.3V and 5V buses together, rather than the whole PSU.

Keep in mind that oftentimes xA@3.3V + xA@5V + xA@12V exceeds the rated wattage. If you have a lot of 3.3V and 5V load, the chances are you'll have to back off on the 12V somewhat. You can't exceed the maximum current of any given bus, but at the same time the sum of the wattage being drawn must not exceed the total wattage.

A power supply rated at 350W with a combined power of 180W means you have 180W available from the 5V and 3.3V rails. The other 170W is at 12V, which is predominantly used by drives and fans.
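A rough sketch of checking a parts list against per-rail and combined limits. All the rail current ratings below are made-up example numbers, not from any real PSU label:

```python
# Hypothetical label: per-rail current limits, a combined 3.3V+5V limit,
# and a total wattage limit - a load must satisfy all three.
RAILS = {"3.3V": 28.0, "5V": 30.0, "12V": 15.0}   # max amps per rail (assumed)
COMBINED_33_5_MAX = 180.0                          # watts, 3.3V + 5V together
TOTAL_MAX = 350.0                                  # watts, whole supply

def check_load(amps):
    """amps: dict mapping rail name -> current drawn on that rail."""
    for rail, a in amps.items():
        if a > RAILS[rail]:
            return f"{rail} rail over its {RAILS[rail]} A limit"
    watts = {r: a * float(r[:-1]) for r, a in amps.items()}  # strip the 'V'
    if watts["3.3V"] + watts["5V"] > COMBINED_33_5_MAX:
        return "combined 3.3V+5V limit exceeded"
    if sum(watts.values()) > TOTAL_MAX:
        return "total wattage exceeded"
    return "ok"

print(check_load({"3.3V": 10.0, "5V": 20.0, "12V": 8.0}))  # ok
```

This mirrors the point above: a load can be legal on every individual rail and still trip the combined 3.3V+5V limit.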

Total wattage has become the accepted way of shopping for a PSU, because it doesn't take much brainpower to look at. Unless you're running a data center with lots of drives, a Delta 300W with 200W available at 5V+3.3V is better than a 350W total with 180W at 3.3V+5V.

A higher total wattage power supply tends to be able to handle surge current (not to be confused with line transients), but a properly engineered 200W power supply won't have any problem powering up 99% of consumer computers.

Such a 200W PSU would cost more than a crappy 430W power supply, and it won't be made, because the false belief that more wattage = better is too detrimental to its sales.

As for the relation between PSU wattage and power consumption, they are rather unrelated. The wattage at the plug is about 43% more than the actual power used by the computer (1/0.7 ≈ 1.43).

So, if you load a 300W PSU to the full 300W, you're taking about 430W from the plug.

A 600W PSU fully loaded means about 860W at the plug. High wattage power supplies are made because people buy them. Manufacturers will make anything if it sells; practicality doesn't matter.
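The plug-side arithmetic, assuming the same ~70% efficiency figure used throughout this thread:

```python
# Wall draw for a given DC load, at an assumed ~70% PSU efficiency
# (1/0.7 ~= 1.43, hence "about 43% more at the plug").
def plug_watts(dc_load_watts, efficiency=0.70):
    return dc_load_watts / efficiency

print(round(plug_watts(300)))  # ~429 W at the plug for a maxed-out 300 W PSU
print(round(plug_watts(600)))  # ~857 W at the plug for a maxed-out 600 W PSU
```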

My Athlon 1.4GHz Tbird rig with two 7200RPM 40GB HDDs, two CD-ROM drives, and an ATI 8MB video card only uses about 90W of power from the power supply. Add a GF4 and you might use another 40W. I highly doubt that you'll see an average consumer computer that uses more than 200W of power from the power supply.


That said, I'm happy I bought a 430W Antec: not because I need 430W, but because I sure as hell don't want to have another PSU fail on me for lack of capability! It's not all that much more expensive to manufacture a 500W PSU than a 250W PSU, and that's a lot of peace of mind for those of us who do have 7 drives, 12 fans, and every PCI slot used.

Actually I've made quite a few power measurements on PC's using only a multimeter, and quite a cheap one at that. The results that I obtained were quite consistent with those posted above by Jerboi (and have also been confirmed using some more sophisticated equipment).

The trick is that the front end of a normal computer PSU feeds straight into a full-wave rectifier (see note), and a cheap multimeter only really measures a "proportional rectified average" (pi/sqrt(8) times the rectified average current) rather than the true RMS current. The upshot is that if you measure the AC input current with a cheap multimeter and then divide the result by pi/sqrt(8) (approx 1.11), you have a very good estimate of the average DC current supplied by the PSU's front-end full-wave rectifier. As you know the DC working voltage of this rectifier (approx 1.4 times the RMS line voltage), you can deduce the PSU input power to a reasonable approximation.
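As a sketch of that correction in code (the 0.35 A meter reading is an assumed example value, not a measurement from this thread):

```python
import math

# Average-responding meters read pi/sqrt(8) (~1.1107) times the rectified
# average, so dividing the reading by that form factor recovers the average
# DC current out of the bridge; the bridge's DC bus sits near sqrt(2)*V_rms.
FORM_FACTOR = math.pi / math.sqrt(8)       # ~1.1107

def psu_input_power(meter_amps, line_vrms=240.0):
    i_dc = meter_amps / FORM_FACTOR        # average rectifier output current
    v_dc = math.sqrt(2) * line_vrms        # ~340 V DC bus on 240 V mains
    return i_dc * v_dc                     # watts into the PSU

print(f"{psu_input_power(0.35):.0f} W input")   # ~107 W for a 0.35 A reading
```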

Before you lampoon this technique, remember that even if you use the most accurate equipment for the input power measurements, the figure of 70% efficiency is only a "ball park" figure and could easily be anywhere between 65% and 80%, so there are inherent approximations no matter which way you do it. BTW I'm not criticizing the 70% figure, as I think 70% to 75% is about right.

Here are the results for the estimated PSU output power when running Windows for the two systems I have here at the moment:

Note: All the PSUs I use here in Australia have the (240 volt) mains input feeding directly into a full-wave rectifier; this is the topology on which I've based my calculations. Some of our power supplies also have a dual-voltage (120/240 volt) selector switch. When these units are switched for 120V operation, the input rectifier is re-configured to function as a capacitor/diode voltage doubler/rectifier circuit. I'm just pointing out that I've only used the above technique for the case of a simple rectifier front end and have not analyzed whether it can work in the voltage doubler case.

Actually I've made quite a few power measurements on PC's using only a multimeter, and quite a cheap one at that.

Actually I have done this in the past also and got a pretty good result, but it's quite a pain in the butt. As you've said, the power supply runs on 340V DC. If you're on 240V AC, you simply rectify it and you have 340V (240V × sqrt(2)). If you're on 120V, the voltage is doubled and then rectified to yield 340V (120 × 2 × sqrt(2)). A voltage multiplier takes the peak-to-peak from AC, and it only works if the input is AC.

So what I did was shoot the voltage up to 240V AC using a transformer, then rectify it to 340V DC using a 600V-rated bridge and a 5,000µF capacitor bank. You need to charge the capacitor through a resistor on initial power-up, or else you'll blow the fuse or, worse, the bridge rectifier.

Next, you set your computer to 230V (just in case), connect the two prongs to the 340V DC, and connect a multimeter in series. You should place a jumper lead across the meter probes, then remove it once the connection has been established, so you won't push the inrush current through the meter. If you let the inrush current through your meter, it could cost you an expensive 600V CAT III meter fuse ($5-10 each).

The trick is that the front end of a normal computer PSU feeds straight into a full-wave rectifier (see note), and a cheap multimeter only really measures a "proportional rectified average" (pi/sqrt(8) times the rectified average current) rather than the true RMS current. The upshot is that if you measure the AC input current with a cheap multimeter and then divide the result by pi/sqrt(8) (approx 1.11), you have a very good estimate of the average DC current supplied by the PSU's front-end full-wave rectifier. As you know the DC working voltage of this rectifier (approx 1.4 times the RMS line voltage), you can deduce the PSU input power to a reasonable approximation.

Wow, that's new to me. I never really understood how non-true-RMS DMMs measured voltage. True-RMS ammeters and voltmeters both have a little computer that samples the waveform thousands of times a second and calculates the true RMS value on the fly. Although the input power factor varies (meaning the waveform varies) depending on the load to an extent, and I have a feeling this method will fail to compensate for that.
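A quick numerical check of why the pi/sqrt(8) factor works for a clean sine, and where it breaks down for a peaky, PSU-like current waveform (everything here is computed, nothing is measured):

```python
import math

# Sample one cycle of a unit sine and compute both the true RMS (the
# sample-and-square method true-RMS meters use) and the average-responding
# estimate (rectified average * pi/sqrt(8)). They agree for a pure sine.
N = 100_000
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]

def true_rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def avg_responding(samples):
    return (sum(abs(s) for s in samples) / len(samples)) * math.pi / math.sqrt(8)

print(true_rms(sine), avg_responding(sine))    # both ~0.7071 for a pure sine

# A peaky waveform (narrow conduction pulses, like a capacitor-input
# rectifier draws): the average-responding estimate now reads low.
pulses = [s if abs(s) > 0.95 else 0.0 for s in sine]
print(true_rms(pulses), avg_responding(pulses))
```

This is exactly the caveat above: once distortion changes the waveform's form factor, the fixed pi/sqrt(8) correction no longer holds.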

Before you lampoon this technique, remember that even if you use the most accurate equipment for the input power measurements, the figure of 70% efficiency is only a "ball park" figure and could easily be anywhere between 65% and 80%, so there are inherent approximations no matter which way you do it. BTW I'm not criticizing the 70% figure, as I think 70% to 75% is about right.

I actually measured efficiency just a few minutes ago, but don't hold me to it.

I connected a dummy load (a long *** extension cord... lol!) on the 5V line and drew 17.8A from the 5V bus; my PSU is rated at 33A on 5V.

The current was 17.80A based on a 0.5% + 3 digit DMM. I *assumed* the voltage was 5.0V, because I only had one DMM, and if I removed the DMM from the current measurement, it would alter the series resistance and change the current, so that was not an option.

That gives an output of 89W.

Line voltage=122.1V
Line current=1.56A
VA=189VA
power=121W
PF=0.64.

89/121 = 73.6% efficiency at ~30% total load, all on the 5V bus.

During this measurement the power factor was 0.64, which corresponds to a true-RMS input current of about 1.55A (189VA / 122.1V), consistent with the 1.56A reading.
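Spelled out, the arithmetic of that measurement (the 5.0 V is the assumed value from above; the rest are the posted readings):

```python
# Output side: assumed 5.0 V on the loaded rail, measured 17.80 A.
v_out, i_out = 5.0, 17.80
p_out = v_out * i_out                 # 89 W delivered on the 5 V rail

# Input side: posted line voltage and real-power readings.
v_line, p_in = 122.1, 121.0
efficiency = p_out / p_in             # ~0.736
pf = 0.64                             # posted power factor
va = p_in / pf                        # ~189 VA apparent power
i_line = va / v_line                  # ~1.55 A true-RMS line current

print(f"{efficiency:.1%}, {va:.0f} VA, {i_line:.2f} A")
```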

One thing to remember about multiple computer loads is that the total wattage is the sum of each computer's wattage, but that's not necessarily true for the current.

The power factor is down to 0.64 due to harmonic distortion. 1.5A + 1.5A + 1.5A + 1.5A will come to less than 6A, because harmonics from multiple computers tend to partially cancel, reducing the distortion and increasing the power factor. The resultant current is what really counts when you're trying to squeeze many computers onto the same circuit. You'll have to actually connect all the computers you intend to put on the same circuit through a power analyzer to see what kind of current they're really putting on the line. You shouldn't exceed 12A of computer load on a 15A circuit: computer loads are more or less continuous duty, and the National Electrical Code specifies you can only load a circuit to 80% of its rated current for continuous use.
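A sketch of that 80% continuous-load budget. The per-machine currents are hypothetical, and this deliberately uses the worst case (currents simply adding); as noted above, a real measured total would come in lower because the harmonics partially cancel:

```python
# NEC-style continuous-load check for one branch circuit.
CIRCUIT_AMPS = 15.0
CONTINUOUS_LIMIT = 0.80 * CIRCUIT_AMPS   # 12 A usable for continuous loads

def fits_on_circuit(machine_amps):
    """Worst case: assume the per-machine currents add linearly."""
    return sum(machine_amps) <= CONTINUOUS_LIMIT

print(fits_on_circuit([1.5] * 6))   # True  (9 A worst case, under 12 A)
print(fits_on_circuit([1.5] * 9))   # False (13.5 A worst case, over 12 A)
```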
