Posted
by
timothy
on Tuesday May 29, 2012 @10:13AM
from the with-compound-interest-you'll-die-rich-in-nazeroth dept.

New submitter MBAFK writes "My coworker Geoff and I have been taking power meters home to see what the true cost of PC gaming is: not just the outlay for hardware and software, but what the day-to-day costs really are. If you assume a 20-hour-a-week habit and $0.11 a kWh, actually playing costs Geoff $30.83 a year. If Geoff turns his PC off when he is not using it, he could save $66 a year."
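Working backwards from those figures (the implied wattages below are my inference from the summary's numbers, not measurements from the article):

```python
# Back out the implied power draws from the summary's dollar figures.
RATE = 0.11                 # $/kWh, from the summary
HOURS_GAMING = 20 * 52      # 20 hr/week for a year

# $30.83/year of gaming -> kWh -> average draw while playing
gaming_kwh = 30.83 / RATE                            # ~280 kWh/year
gaming_watts = gaming_kwh / HOURS_GAMING * 1000
print(f"implied draw while gaming: {gaming_watts:.0f} W")   # ~270 W

# $66/year saved by powering off -> implied idle draw over the other hours
idle_hours = 365 * 24 - HOURS_GAMING                 # 7720 non-gaming hours
idle_watts = (66 / RATE) / idle_hours * 1000
print(f"implied idle draw: {idle_watts:.0f} W")             # ~78 W
```

So the summary's numbers are internally consistent with a roughly 270 W load while playing and an idle box drawing under 80 W.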

I'm not sure how this has anything to do with the cost of PC gaming, considering that my mother, who only uses her computer for Facebook and TurboTax, could see the exact same benefits by doing the exact same things the article suggests.

It really would have been helpful to know what hardware you tested on. I get that the CPU and GPU both likely downclocked when idle, but does it have a hard drive that spins down or runs slower (WD Green and friends) when not in use? Does it have an SSD? What monitor did it have? Was it an LED backlight or one of the older ones that use more power?

I also can't imagine anyone not setting their monitor to power off when idle.

I play computer games once in a while, so I still need those high-end cards (not SLI/Crossfire). I really wish there were a way to make those video cards run in a low-power mode, idling most of their features when not needed. Sure, I can swap the hardware, but that's annoying.

Except the only person that "needs" that ultra-high-end graphics card is someone looking for ePeen bragging rights, as most games are console ports and won't even stress 4-year-old cards. I should know, as me and my boys play on $50 HD 4850s and the games look awesome and never skip at the native 1600x900 of our monitors. In fact, the newer cards have for the most part been seeing heat and power go down, not up, for everything but the ePeen cards, thanks to die shrinks and better designs.

So I'm sorry friend but there really isn't a point in ePeen cards unless you are just going for bragging rights or are doing serious GPGPU work because the games just ain't stressing the systems that hard.

This page [tomshardware.com] has benchmarks for that card with modern games. The 4850 seems to average 30-40 fps in most games at 1680x1050 (Crysis 2 was worse), and the benchmarks there don't show a minimum (which is usually about half the average). That's a bit crappy.

If you have a larger monitor resolution (the article we had recently put the most popular widths at 1366, 1600, and 1920), then you will need a slightly beefier card, but again those can be had for less than $100 US, and again they don't need the huge PSUs and insane power draws of something like the 79xx cards.

So the person I was responding to was simply wrong when he said you'd need 500W just for the graphics cards to play modern games. I've been playing tons of games and they look just fine and so far

I didn't measure differentials for optical disc activity (DVD burner was idle when testing) or for high levels of disc activity (disc was spinning, but not being actively used during testing) but the thing that stands out to me is that the background power usage of this machine is larger than the differential caused by CPU utilization.

There are definitely some offenders out there; but contemporary GPUs are increasingly good at cutting back when they aren't needed. Laptop OEMs won't touch an architecture if it will utterly toast the battery just to dump the desktop to the screen, and desktop cards(while their maximum draw seems to edge ever upward) have inherited a similarly parsimonious lower end.

Sleep on a modern machine is pretty damn good. On my main gaming PC, if you turn off the monitor and sleep the system, it uses 3.18 watts. If you turn the machine off rather than sleeping, it uses 2.92 watts.
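Putting those two measurements side by side (the $0.11/kWh rate is borrowed from the summary, not from this poster):

```python
# Yearly cost difference between sleep (3.18 W) and soft-off (2.92 W).
RATE = 0.11                                   # $/kWh, assumed from the summary
delta_watts = 3.18 - 2.92                     # 0.26 W in sleep's favor... of off
delta_kwh_per_year = delta_watts * 24 * 365 / 1000   # ~2.3 kWh/year
delta_dollars = delta_kwh_per_year * RATE
print(f"sleep vs off: ${delta_dollars:.2f}/year")    # about a quarter a year
```

At those measurements, choosing shutdown over sleep saves roughly a quarter per year, which is hard to get excited about.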

Sure, a high-end gaming system is going to draw more power even when idle than a crappy, underpowered, and outdated system (like my mom's).

Current generation of video cards, even the high end, draw maybe 2-3 watts when idle. The Ivy Bridge CPUs are 77 Watt TDP rated and idle they consume peanuts. With a good PSU and SSD, I seriously doubt these systems will draw more power idle than a crappy outdated system.

I'd wager they'll draw far less - the demand for long battery life in the mobile space has made its way into desktop hardware. It's amazing what the engineers can come up with when they actually try to reduce power consumption. Today's crappy outdated system might have been a high-spec machine five years ago; unfortunately, that was near the peak of our "throw more electricity at the problem" phase in hardware design (Prescott, anyone?).

Not to mention that this is just stupid if you are on Win 7. With hybrid sleep, everything is off except a small amount of power to keep the RAM alive, and when you hit the button you are back up in less than 10 seconds, at least on a desktop. When you figure in the amount of power a full boot uses, you are penny-wise and pound-foolish to go for a full shutdown anymore.

So even PC gamers would be worse off to follow this "advice" instead of simply putting the unit to sleep when not in use. As an adde

If your mother only uses her computer for Facebook and TurboTax but draws 100W while idle, then your mother needs building advice. Nudge her into moving to an Ivy Bridge Core i3 (and use the integrated graphics; don't add a graphics card) when they come out in a couple of months.

(Actually if that's all she does, maybe even an Atom or Bobcat system will be enough, but in 2012 I don't recommend going that way.)

Yeah every now and then Slashdot has these silly articles about PC power consumption, "kill a watt" etc.

The power consumption of modern PCs (post-P4) has gone down to a level where most home users would usually be better off looking for savings in other areas: driving more efficiently, or using less cooling/heating (and making it more efficient: insulation, sealing, etc.).

As for gaming, sure, a high-powered gaming rig will use a few hundred watts (and usually less if you're not doing SLI). But that's far from the most energy-hungry way of having fun. Your hobby could be drag racing, or hiking/rock climbing somewhere that requires a 1-hour drive, or even baking cakes. FWIW, even cycling and other sports might be more energy-hungry if you replace the calories burnt by eating more of stuff that requires a fair bit of energy to produce (e.g. US corn-fed beef).

So if all that exercise makes you eat an additional half pound of beef (400kcal), that's about the equivalent of running a 300W gaming rig + monitor for 9 to 10 hours.
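The arithmetic behind that comparison; the 6x production-energy multiplier is an assumption on my part (published estimates of beef's energy-in-to-calories-out ratio vary widely):

```python
# Energy embodied in half a pound of beef vs. hours of gaming-rig runtime.
KCAL_TO_KWH = 4184 / 3.6e6       # 1 kcal expressed in kWh (~0.00116)

food_kcal = 400                  # half pound of beef, per the comment
production_multiplier = 6        # assumed: input energy >> food calories
input_kwh = food_kcal * production_multiplier * KCAL_TO_KWH  # ~2.8 kWh

rig_kw = 0.300                   # 300 W gaming rig + monitor
hours = input_kwh / rig_kw
print(f"{hours:.1f} hours of gaming")   # ~9.3 hours at these assumptions
```

With a multiplier around 6, the "9 to 10 hours" figure checks out; a different multiplier scales the answer proportionally.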

In contrast 1 pound of chicken = 1.1 pounds of CO2.

I've even seen many people here who say they still prefer to use incandescent lighting. It doesn't take that many bulbs to use as much as a gaming rig, even fewer for a facebook/browsing PC/notebook. A single fluorescent tube lamp uses about 40W already.
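As a rough equivalence, with typical wattages that are my assumptions rather than figures from the post:

```python
# How many always-on bulbs it takes to match a PC's draw
# (wattages here are typical assumed values, not measurements).
loads = {"gaming rig + monitor": 300, "facebook/browsing PC": 60}
for name, watts in loads.items():
    print(f"{name}: ~{watts / 60:.0f}x 60 W incandescent, "
          f"~{watts / 40:.1f}x 40 W fluorescent tube")
```

A 300 W rig is five 60 W incandescents; a browsing box is one bulb, or a tube and a half.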

People who refuse to use CFLs because "the color's not right" or "it takes too long to start up" aren't the kind of folks who are worried about electric bills or global warming. Also, fluorescents run far cooler, so your AC costs drop with them.

Still, a lot of people still haven't got the message: "If you turn shit off when you're not using it, your power bills go down." That seemed to be the overriding message behind the summary. This goes for a lot of things: TV, aircon/heater, lights, and whatnot. The only device I have on 24/7 in my house is the fridge. Some people are actually surprised I don't have an A$1000-per-quarter power bill (I pay about A$80-120 per quarter at A$0.22-ish per kWh).

It all depends where you live; electricity prices seem to vary massively across America and presumably even more around the world.

Further complicating matters, your local climate, building design, and heating or cooling systems affect the real cost of indoor electricity usage for you. If you live in a cold climate and use resistive electric heating, then running your computers is effectively free because it just displaces heating. OTOH, if you live in a hot climate where you are running aircon all the time, then

Why not grab a Kindle Fire, show her how to use it, and realize it uses less TEI (total environmental impact) for way less than said $500 system (though i3s sell at Walmart for roughly $375)? If she can handle a B&W e-ink screen, the DX has unsurpassed screen size and lifetime 3G for Whispernet, though it costs as much as a Walmart PC, and unlike the Fire it is not in color (yet; color e-ink has been POCed). I've heard that a rooted Kindle lasts 8 hours a charge, and if left unrooted lasts 1-2 months with wifi/3G disabled

Not to mention you have to figure in the long tail of the hardware, as long as you don't buy OEM crap, that is. My gaming PC from 2001? Still running, owned by a little checkout girl who uses it as a nettop. My gaming PC from '06? Still running; in fact, the guy that got it still games with it, mainly flight sims. Hell, the only reason I upgraded from my 3-year-old AMD quad is that I found a killer deal on a hexacore and my youngest was bitching about his dual, so I figured I'd kill two birds with one stone.

Back when BTC was above $8 and you were using modern Radeon cards, it was roughly break-even. If this is in a room that needs to be air conditioned, I would ballpark triple the energy costs. I decided it wasn't worth it unless it was winter.

I ran a kill-a-watt test recently. It costs about $5.50/month to run the PC idle, $13.50 a month to run a miner with an ASUS EAH6850 graphics card. I mine at about 230 Mhash/sec which makes about $22.63/month at current difficulty and exchange rate.
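Netting those numbers out, treating the idle $5.50 as a baseline you would pay anyway:

```python
# Monthly mining economics from the kill-a-watt figures in the comment.
idle_cost = 5.50       # $/month with the PC idle
mining_cost = 13.50    # $/month with the EAH6850 mining
revenue = 22.63        # $/month at current difficulty and exchange rate

incremental_cost = mining_cost - idle_cost     # $8.00 of extra electricity
profit_vs_idle = revenue - incremental_cost    # if the PC would be on anyway
profit_vs_off = revenue - mining_cost          # if it would otherwise be off
print(f"vs idle: ${profit_vs_idle:.2f}, vs off: ${profit_vs_off:.2f}")
```

So at those rates mining clears about $14.63/month over an already-idle PC, or $9.13/month over a powered-off one, before cooling costs and hardware wear.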

What about switching out power-hungry gaming cards for newer, more efficient cards? This year's mid-end model may have comparable performance to last year's mid-high-end model but might draw half the power. Over time the lower power consumption adds up, and you can get by with a smaller power supply. Likewise, trading in your hard drives for a solid state drive (maybe using a green HDD for extra storage)? And for old-timers, switching out CRTs for LCDs? Overall, I think it'd be easier for people to upgrade to more energy-efficient components than it would be for them to change their PC usage habits. Lowering the sleep/HDD shutoff/monitor shutoff timers can make a big difference too, without having to remember to shut down your PC every day or wait for it to reboot. Not an option for everyone, but gamers usually aren't on a shoestring budget, or else they wouldn't be able to afford the PC and the games in the first place.

..you'd be playing for a couple of years to justify the cost of upgrading just for that reason.

The whole debate is stupid; the time spent (presumably happily) versus money for electricity is pretty much nothing compared to just about any hobby. Hell, even just buying sneakers is more expensive per year.

Not to mention the energy costs incurred when the equipment was made.

Just buy a phone and play with it? Uses much less energy. The games suck, though.

This doesn't work for the same reason that virtualization rarely yields absolute savings. Instead of "doing the same with less", the pointy heads see all this newly-freed up hardware and decide to re-use it. You end up "doing even more with the same". So your costs-per-work-unit go down, but your absolute costs stay the same (or go up once virtualization costs are factored in).

The same goes for people buying hardware. We rarely say "oh, I can buy this computer that has A) the same performance and B) bet

It really depends on the situation. For example, I build packages for my open source project. The computer science department donated 20 machines for use in a cluster while I was there; I could build around 2000 packages in 10 days. After I left the university, I had to do it with my own computing equipment. Today, I can build the same software in about 2 days on my desktop computer. If I were paying for the electricity to run 20 Dell OptiPlex systems with Pentium 4 1.7GHz-2.0GHz CPUs + IDE disks to the

I don't think the parent is suggesting that you buy components to replace fully functioning and useful parts just to save electricity. Potentially, though, you could save real, actual money by buying newer parts rather than upgrading your current, old hardware.

I ran an 8800GTX until it died, but it was around 6 months ago and I decided I needed an upgrade (before it failed). If I had gone ahead with the upgrade, I would have paid £100 for the card, and another for a 1kW PSU to handle the draw. Those

Yes, I was referring to regular upgrades you might do anyway. For example, the Radeon HD 7850 (this year's mid-end model) and the 6950 (last year's mid-high end model) have comparable performance, but the 7850 draws about 2/3rds the power or less depending on benchmarks. The 6950 sells for less, but the power consumption may make the total cost of ownership similar to or greater than the 7850.
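A total-cost-of-ownership sketch of that comparison; the prices, wattages, hours, and lifetime below are placeholder assumptions for illustration, not measured figures for these cards:

```python
def total_cost(price, watts, hours_per_week, rate, years):
    """Purchase price plus electricity over the card's service life."""
    kwh = watts / 1000 * hours_per_week * 52 * years
    return price + kwh * rate

# Hypothetical numbers for illustration only.
RATE = 0.11   # $/kWh
hd6950 = total_cost(price=220, watts=200, hours_per_week=20, rate=RATE, years=3)
hd7850 = total_cost(price=250, watts=130, hours_per_week=20, rate=RATE, years=3)
print(f"6950: ${hd6950:.2f}, 7850: ${hd7850:.2f}")
```

With a modest price gap and a 70 W load difference, three years of 20 hr/week gaming nearly closes the gap; heavier use or pricier electricity tips it the other way.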

Even with a lousy HDD-of-no-particular-importance, I find that the big timesuck on boot isn't the booting but the "getting all the browser pages and documents and whatnot back to where I left them" problem (yes, even in applications that support session restore, you still run into issues like webpages that have decided to nuke the contents of form fields and such).

For that reason alone, the only real choice is between suspend-to-RAM and suspend-to-disk. With your contemporary soft-off PSU burning a few w

Wow, earth-shattering news here: turning off your PC when you're not using it saves you a significant amount of money! What about factoring in cooling costs? High-end gaming machines put out a lot of heat too. Since many gamers are using SSDs these days, sleeping your computer is great; they resume so fast. It's just common sense. I make sure everyone in my house shuts down or sleeps their machines at night if there is not a valid reason why they are on. It really does help.
The real problem with this

Especially when you factor in that the gaming will keep you away from other hobbies that might be more expensive. Such as: RC airplanes/cars, porn, collecting items, cars, girls (plus you don't need to worry about having kids, which cost even more money!), along with many other things.

Another point in sleep's favor (in Windows; Linux doesn't have this problem) is that when you boot the machine, you have to restart every application. I don't mind booting my Linux box, but I HATE booting Windows. Ironically, I almost never have to boot the Linux box but am forced by updates to boot the Windows box.

I have always seen these types of problems with the so-called "hibernate" or "sleep" modes, so I always disable the feature the first chance I get. The ridiculous amount of time required for the rebooting process hasn't improved much since Windows 3.1, and the more software you use on a daily basis, the worse the problem gets. Let's say you like to keep track of your schedule with a PC-based organizer. If during any particular weekend day you only need to update your schedule four different times and random i

My laptop (Sony Vaio Z) cold boots into Windows 7 in ~9 seconds (fresh install of Windows 7 minus crapware helps a lot). But to be honest I keep my laptop on 24/7 when it's on AC power and use sleep mode whenever I pack it up to bring it somewhere else. I seldom if ever turn it off.

I think a lot depends on how fast the DHCP server assigns the computer an address when it wakes up and requests one. At home it's nearly instant, but at work it can take something like 15 seconds to obtain an address, meanwhile things that depend on network connectivity start freaking out.

True costs: where are the vitamin D deficiency, light sensitivity, prices for Bawls and Red Bull, price of pizza, radon exposure from your mom's basement, Depends for long raid nights, divorce costs, hardware costs and software licensing, and the general lowering of testosterone levels?
Of course the benefits are: water savings because of fewer baths, no social costs (coffee shops, movies, dates, video rentals, vacations, etc.), no expensive presents for friends, less electricity used in the house because no ot

I would suspect C3 sleep states are supported on a majority of systems by now. Perhaps I was just lucky when I picked up the hackintosh board a few years ago. Now, I simply use a reasonably long idle timer and the system goes to sleep/power off. It takes a few seconds to come back out of that state and wholly beats a cold start.

I guesstimate my home system gets about 3-4 hours of usage each day during the week. In addition, there are plenty of other devices around the house which support other core service

All in all, that is really peanuts in terms of electricity bills. If you are spending roughly 2 hours a day gaming, a normal person with a full-time job and a family would have very little time to do much else that can sink money.

Considering that yearly electricity bills routinely reach $1,000+ for a standard household [eia.gov], this added 10% due to gaming is pretty insignificant when compared to other hobbies... like racing cars, for example.

Sure, there may be cheaper hobbies, but I honestly don't think anyone well-settled enough to be practising a daily hobby and deriving enjoyment from it finds it a problem to spend $8.50 a month on their recreation.

In case you hadn't figured it out, some governments subsidize suburbs as part of national policy. Originally it probably was for the following reasons:

1. Keep economy going full steam. New houses, people need stuff for houses.

2. Prevent urban unrest. After seeing Paris, will they want to go back to the tenements? There had been social unrest after wars before, with soldiers returning to lives that were economically worse than army pay. The government knew this.

I'm not sure what exactly the article is trying to convey here, as measuring electrical consumption is merely fine-tuning an existing expense related to a hobby, and an obscenely small amount of money at that (c'mon, ~$30 a year? People spend twice that much in a month on caffeine just to fuel said hobby).

Compare playing video games to spending money on cable TV. Or going to the movies. Or riding a bike outside. Discussing literally pennies of electrical savings per day seems rather pointless when you're spending considerably more to sustain that kind of hobby in the first place.

As of my last month's bill I am paying 28.8 cents per kWh. I'm not sure how much power my computer uses, but with my Nvidia GTX 280 and an overclocked 4 GHz dual-core CPU I would assume at least 400 watts, particularly while playing a game. So let's say 12 hours for a day of gaming: 4.8 kWh, or $1.38 per day of marathon gaming. If you assume 4 days per week, that would be $22.12 per month, or $265.42 a year. Of course my computer may actually use 500 or 600 watts while gaming. What interests me more is how much pow
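Checking that arithmetic (the 400 W full-load figure is the poster's assumption):

```python
# Marathon-gaming cost at the poster's rate and assumed draw.
RATE = 0.288                       # $/kWh
WATTS = 400                        # assumed full-load draw

day_kwh = WATTS / 1000 * 12        # 4.8 kWh per 12-hour marathon day
day_cost = day_kwh * RATE          # $1.38
month_cost = day_cost * 4 * 4      # 4 days/week, ~4 weeks -> $22.12
year_cost = month_cost * 12        # -> $265.42
print(f"${day_cost:.2f}/day, ${month_cost:.2f}/month, ${year_cost:.2f}/year")
```

The figures in the comment reproduce exactly, so the only real uncertainty is the wattage guess.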

Hey, there is also the cost of components that die from heat-related failure, both in terms of your pocketbook and the environment. Of course, turning the PC off when you're not using it helps keep it from getting clogged up with dust so fast.

You're working on one of the smallest possible incremental changes in your house's electrical usage. What's the point?

The wall warts (AC adapters) scattered about your house almost certainly use and waste more electricity than your PC. The US EPA guesstimated in 2005 that around 200 billion kWh a year (about 6% of US electricity consumption) goes through these things, and a significant portion of that (30-50%) is wasted.

40-minute-long hot showers also cost a lot on the water and gas bills.

In fact, heating water is one of the more expensive things in energy terms (water has quite a large thermal capacity, after all). A quick back-of-an-envelope calculation puts the cost of a 40-minute shower somewhere in the region of 10-12 kWh. (Standard US shower flow rate is 2.5 gallons per minute, and assuming that you're looking to raise the water temperature by around 50F.)
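The envelope math, spelled out (2.5 gpm flow and a 50°F temperature rise, per the comment):

```python
# Energy to heat the water for a 40-minute shower.
GALLON_L = 3.785                   # litres per US gallon
SHOWER_MIN, FLOW_GPM = 40, 2.5     # standard US shower head
DELTA_C = 50 * 5 / 9               # 50 F rise expressed in kelvin (~27.8)
C_WATER = 4186                     # specific heat of water, J/(kg*K)

mass_kg = SHOWER_MIN * FLOW_GPM * GALLON_L   # 100 gallons ~ 378.5 kg
energy_kwh = mass_kg * C_WATER * DELTA_C / 3.6e6
print(f"{energy_kwh:.1f} kWh")               # ~12 kWh
```

That lands at the top of the quoted 10-12 kWh range; real showers lose some of that to mixing with cold water and imperfect heater efficiency, which moves the number in either direction.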

If he weren't interested in gaming he could likely make do with a much less powerful GPU and/or possibly a more power-efficient CPU. The combination of those two would reduce his power consumption even further during non-gaming-related computer usage (or idling).

To really figure the electrical cost of gaming, you have to figure out what else people would be doing if they weren't playing games. Some activities, like watching TV, would use as much or more power.

My guess is if we calculated the energy use of those other activities, gaming might be a net energy saving activity.

If you are playing PC games, the lights all over the house may be turned off. If you were not playing PC games, then you might be moving around the house with the lights on. Likewise, in winter the heat from the game is just heating your house. Even better, it's heating the room you are in, so you can let the rest of the house be cooler. If you were not gaming, perhaps you would be driving your car somewhere, like your girlfriend's house, and using gasoline. It could be that gaming saves you money over alternative activities in terms of electricity.

Everyone always has a right to complain, but some people's complaints are silly and make me think they're idiots, or to put it nicely, their personality is generously infused with irony.

I can't say whether or not you're an idiot, though, because you merely said "too high" rather than explaining why you think your rates are "too high" -- you might have good reasons which expose corruption in your state's PRC, or you might have amazingly stupid and arrogant reasons, based on arbitrarily saying things without

Let X be the cost of normal, non-gaming usage.
X + $30.83 = cost of gaming 20 hours a week in addition to (or in place of?) normal usage.
X - $66.66 = cost of non-gaming usage if you shut down the PC when not using it.

But what is his time worth? If I value my time at the same rate my employer does, then the startup time of my computer costs about 15 cents. I use the PC in the morning before work and in the evening after work and throughout the day on weekends, so that's 30 c

As a ballpark, for most regions I find the yearly cost of an item running 24/7 to be about $1 per watt.

Rephrased: one watt running continuously for a year is 8.76 kWh, so at an all-in rate of about 11 cents per kilowatt-hour it costs about a buck a year. Plus or minus leap years and leap seconds. After endless add-on taxes, and fees, and fees disguised as taxes, and taxes disguised as fees, that's probably about what I'm paying when I write a check.
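The rule of thumb works out as follows (the ~11.4 cents/kWh is the all-in rate at which a watt-year costs exactly a dollar):

```python
# One watt running continuously for a year, in kWh and dollars.
HOURS_PER_YEAR = 24 * 365                    # 8760, ignoring leap years
kwh_per_watt_year = HOURS_PER_YEAR / 1000    # 8.76 kWh per watt-year
ALL_IN_RATE = 0.114                          # $/kWh at which 1 W-year = $1
dollars_per_watt_year = kwh_per_watt_year * ALL_IN_RATE
print(f"${dollars_per_watt_year:.2f} per watt-year")   # $1.00
```

So at a typical post-taxes-and-fees rate, "a dollar per always-on watt per year" is a handy mental shortcut: a 78 W idle PC left on all year is roughly $78.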

I have a meter as well; one thing to consider with replacement appliances is the reliability and longevity of the appliance.

I have a 33-year-old Sub-Zero built-in refrigerator in my new house. It's so old that it has only one knob for temperature adjustment, and the refrigerator compartment on top is slaved to the freezer setting. I've removed the cover to the compressor and coils to clean them, and I've found some indication that a service or two has been performed over the years, but compared to a friend's brand-new LG unit, which has had to be serviced twice in eight months and cost them $1600 to purchase, I'm happy to use this fridge for the moment. Plus, a new built-in refrigerator will cost between $4000 and $8000 depending on what brand and features are chosen. This unit can run for a very, very long time for $4000 worth of electricity.

As for TVs, one doesn't necessarily have to use the fancy, big TV all of the time either. For many years I had a projector screen that could roll down in front of the entertainment center, blocking the 27" TV in it, so I could use my projector when I wanted to watch something of substance. Now, I have the projector in a different room from the TV we watch the news on, and we only use it when we actually want to watch a movie or some other thing where surround sound and a big image matter. Obviously the roll-down method won't work with a fixed TV, but putting the fancy home theatre TV into a different room would.

My current PC (an old Dual-Xeon box) has a hardware sleep switch that ties into some pins on the motherboard, and when pressed the computer drops down to a low power state. When I'm done using it I just put it to sleep, and when I want to use it again it comes back in about three seconds. Works well, keeps all of my programs running fine, and saves power.

There are lots of techniques that can be used to save power, but the biggest hogs in the house (HVAC, hot water heater, refrigerator, oven/range/cooktop) don't hold a candle to the consumer devices that everyone always panics about. If you want the most bang for your buck, insulate your house. Change your windows. Plant some trees that increase shade on the structure. Turn your thermostat up a couple of degrees and install some high efficiency ceiling fans to keep the air moving a little. Sure, turn off the electronics you're not using, but don't assume that it'll be earth-shattering on your power bills just by doing that.

I may switch to LED bulbs at some point, but right now they are incompatible with the dual-brite fixtures outside (they'd basically never dim) and with the failures I've seen in supposedly-lifetime CFLs I'm waiting until there's some installed userbase history to know what to buy and what to avoid. They're SO expensive relative to incandescents that I can't justify the cost to purchase them until I know they'll last long enough to recoup the investment. Having been bitten by CFLs once, I'm not going to ge

I've been installing only two brands of LED: Philips and Sylvania. Both have long guarantees (based on a date stamp on the base); IIRC it's 5 and 7 years. So far I have had one failure, on a bulb that was installed over a year ago, and when I went to submit an RMA I got a response back in one day confirming shipment of the replacement (a newer model too :) without me needing to do anything (they take the serial # on the bulb). I asked in the e-mail if they wanted it back and they said the first gen bulbs were n

That "paltry $100 a year in savings" would buy me nearly a year's supply of toilet paper. Good stuff, not the cheap garbage. The rest of your post is just stupidity personified for the sake of looking superior or clever, I can't tell which, nor do I care, since you fail at both, dumbass.

My home has on average 100 watts of power available. I can use more in the short term, but doing so depletes the battery and means I'll have to use much less for some part of the week. The wind turbine which is my sole source of power is rated at 750 watts, but only generates that much in absolutely perfect conditions. So I know quite a bit about how to use power economically. I can light my whole house effectively with just 18 watts of LEDs. They're strategically placed, yes - but you can easily read more or less everywhere.

In this situation, the graphics card on my computer (Radeon HD 6850 at 127 watts TDP) is actually the biggest power drain I've got. Obviously, my gaming is limited to two or three hours a day... Power is worth thinking about.

Seriously, you live in a very odd situation. While I'm not against conservation, and indeed I do turn my PC off when I'm not home because why use what isn't needed, you can't try and use your situation to apply to the population at large. 100 watts is NOT something I have to think about. My house has about 15,000 watts of power available to it at all times. 100 watts more or less is not noticeable and is well within the margin of error I get depending on how the AC is run.

The only thing that is gas in my house is the stove. With 3 full-size PCs (700W+) running 24/7/365, and heating/cooling the house with central air/heat 24/7/365, my electric bill is usually between $100-180. I'd be interested to know how it compares to someone using gas to heat and cool. I have a feeling that if you took out my PCs and the central air, my bill would be around $5 a month lol

I use gas for heat and the stove. Everything else is electricity, including the water heater. I have individual A/C units. Every bulb in the house, except for a handful of dimmable bulbs is CFL. My monthly bill, being in an expensive part of the country, is always within spitting distance of $200 and fairly consistent regardless of season. My gas bill, however, goes from roughly $120 in the winter to $30 in the summer.

A year or two ago I looked at energy consumption on most of my appliances and electronics.

There's nothing I can do about it. The fun thing about grieving is when you reach the "acceptance" stage, well, you accept it. Thinking about all the things I "could have done" leads absolutely nowhere except depression. On the other hand, there's a lot of things I can still do!