How does one go about determining how much electricity is used for a particular device? Where can I find the wattage ratings, and how do I calculate the cost based on my current electricity rates found on my monthly bill?

Specifically, I have a computer and I am debating whether to leave it on on a regular basis. I wonder how much it would cost in each of the following scenarios:

So, for example, let's take the worst case shown in the table: the IBM ThinkCentre M52 running Folding@Home with the monitor on drew an average of around 175 watts.

175 watts * 24 hours/day * 31 days = 130,200 Wh = 130.2 kWh

Assuming a local energy cost of $0.15 / kWh, that works out to $19.53 per month to run a relatively inefficient computer at 80% CPU usage, with the monitor on, 24 hours a day, all month.

On the other hand, if the computer is drawing 1 watt while in the "off" state...

1 watt * 24 hours/day * 31 days = 744 Wh ≈ 0.7 kWh

At $0.15 / kWh, it will cost you approximately $0.11 / month to have your computer off (but still plugged into the wall).
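
Here's the same arithmetic as a small Python sketch, just so you can plug in your own numbers (the wattages and the $0.15 / kWh rate are only the example figures from above):

```python
def monthly_cost(watts, rate_per_kwh, hours_per_day=24, days=31):
    """Estimate the monthly cost of a device drawing a constant wattage."""
    kwh = watts * hours_per_day * days / 1000   # watt-hours -> kilowatt-hours
    return kwh * rate_per_kwh

# The two scenarios above, at $0.15 / kWh:
print(monthly_cost(175, 0.15))  # ~19.53 -- ThinkCentre at full load, monitor on
print(monthly_cost(1, 0.15))    # ~0.11  -- "off" but still plugged in
```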

Of course, you will likely not find your specific setup in a table of power values (and even if you did, you cannot be sure that it will match your specific setup). If you want to find out the real numbers for your setup, you can purchase a power usage meter (e.g. Kill-a-watt). That info, along with your cost per kWh, will give you what you are looking for.
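
If you let such a meter run for a few days, one rough way to turn its cumulative reading into a monthly figure is to scale it up by time (the 8.9 kWh over 72 hours below is just a made-up sample reading, not a measured value):

```python
def projected_monthly_cost(kwh_measured, hours_measured, rate_per_kwh,
                           hours_per_month=24 * 31):
    """Scale a meter's cumulative kWh reading up to a full month."""
    avg_kw = kwh_measured / hours_measured       # average draw over the sample period
    return avg_kw * hours_per_month * rate_per_kwh

# e.g. the meter shows 8.9 kWh after 72 hours, at $0.15 / kWh:
print(projected_monthly_cost(8.9, 72, 0.15))     # ~13.79 per month
```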

If you are looking for just a generalized answer, then my speculation would be:

I've got a clamp-on Hall-effect ammeter that I use with an AC line splitter to do that. If you've got any electrician (or even computer nerd) friends, odds are good that they have one you could borrow. Or, you could pick up one of those "kill-a-watt" type devices at a local store.
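
Keep in mind that a clamp ammeter only reads current, so you still have to convert to watts yourself. A rough conversion looks like the sketch below (the 120 V, 1.5 A, and 0.95 power factor are just illustrative guesses, not measurements):

```python
def estimated_watts(volts, amps, power_factor=1.0):
    """Rough real-power estimate from a clamp-meter current reading.

    Volts x amps gives apparent power (VA); real watts are lower by the
    power factor, which for a PC's switch-mode supply may be well below 1.
    """
    return volts * amps * power_factor

# e.g. 1.5 A measured on a 120 V circuit, guessing a 0.95 power factor:
print(estimated_watts(120, 1.5, 0.95))  # ~171 W
```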