Simple Power/Energy Question

Now, I think I know how to do this one, but I'm just not sure if I'm using the correct figures. To figure out how much it would cost to run an appliance for a certain amount of time per month, you would need to know the wattage of the appliance, correct? Does the voltage need to be known? For example, a guitar amplifier draws 120 watts at 250 volts. I tried to use just the wattage, converted to kW, and multiplied by 3600 to get kWh. For some reason, I got outrageous numbers for the amount it costs. Can anybody tell me what I'm doing wrong? By the way, power costs $0.075 per kWh.

You need to divide the wattage by 1000 so that you have kilowatts. The voltage doesn't matter; it's already been taken into account in the calculation of the wattage.

As an aside, devices are often labeled with their maximum power consumption, i.e. your amp won't be drawing a full 120 watts if it's just sitting there powered on while you're not playing anything on your guitar.

Edit: Sorry, you said that you had converted to kW in your calculation. I get 0.12 kW × 1 h × $0.075/kWh ≈ 0.9 cents per hour to run your amp at its maximum power consumption. That doesn't sound so bad, I think.

Edit2: Heh, the corrections just keep coming. Your problem is likely related to the fact that you're multiplying by 3600 (the number of seconds in an hour, I suppose, was your reasoning) to get kWh. A kilowatt-hour is kilowatts times hours, so for one hour of use you should actually be multiplying by 1.

Ok, I see now. So at maximum output it's 0.12 kW, and running it for one hour gives 0.12 kWh. Taking that times my $0.075 per kWh gives $0.009, so about a cent per hour to run my amp. Thanks.
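The arithmetic the thread settles on can be sketched in a few lines of Python. This is just a minimal illustration using the figures quoted above (120 W amp, $0.075 per kWh); the function name and structure are my own, not anything from the thread.

```python
def running_cost(watts, hours, rate_per_kwh):
    """Cost in dollars to run an appliance of the given wattage.

    kWh = (watts / 1000) * hours -- note hours, not seconds,
    which is where the factor-of-3600 mistake crept in.
    """
    kilowatts = watts / 1000.0
    energy_kwh = kilowatts * hours
    return energy_kwh * rate_per_kwh

# One hour of the 120 W amp at $0.075/kWh:
cost = running_cost(120, 1, 0.075)
print(f"${cost:.3f} per hour")  # about a cent per hour
```

Multiplying by 3600 instead of 1 here would report $3.24 per hour, which matches the "outrageous numbers" from the original question.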