If you are looking for the lowest power usage, I would start by doing something about having a 1090T running at 100% 24/7. Why is it running full tilt all the time? Is there some way you could change your workflow to reduce CPU usage? If not, a more efficient CPU is going to save more power than a PSU swap.

The fileserver is pulling about 75W with all drives spun up (4W in S3 sleep). The streamer is pulling 180W 24/7 at full workload. I just reduced this from 200W today by underclocking the GPU to its minimum, removing the GPU fan from the Accelero, and removing the PCI sound card.

Will changing the CPU achieve much? I cannot reduce the workload of the CPU, as the encoding must be done. The problem is that this machine is an AMD setup with DDR2 RAM, so changing to an i7 + DDR3 RAM + a new motherboard is going to be a large expense, which I assume would take 5+ years to recoup in electricity costs?

And why do you need to stream video? It might seem like an odd question to ask but why do you need to use 180W all of the time to do this? What does this really achieve? If it is not for some work purpose then why do it?

If a 1090T is good enough then an i5 will be fine as a lower-power replacement. No need for an expensive i7. You can also go low-end on the motherboard; an H61 would be fine. You could probably cut your power cost in half or more: 90W or less at full tilt is not out of the question.

However, if your encoding can use quicksync then you could go for an i3 and beat the 1090T in power and speed.

On the video card: do you need it at all? Yanking it would be a quick power saving. Many motherboards can run headless, or you could put in a cheap (or free) 1-2W PCI card just to get it to boot.

edh is asking the right question: why do you need to stream all the time? If it's just personal use, streaming to media devices, get better devices that don't need transcoding.


The stream is an automated hobby. I'm streaming games, so it requires the graphics card, sadly (though not much power is required; the 5670 is underclocked to the minimum in the ATI control panel).

There may be the rare task where the 6 cores of the 1090T make it faster than the 4 cores of an i5. In almost all cases the Intel cores are so much faster that, even with 2 fewer, it still performs better. Even in video encoding, which usually takes advantage of multiple cores very well, the i5 usually wins. In any case, the i5 is so much better on performance/watt that it will always use less power for a given task.

Maybe a bit more than 90W with an i5 if you absolutely must have the video card. If the speed of the video card is not important, then the Intel HD graphics built into current Intel CPUs will probably do fine and cut power further.

Have you looked into using QuickSync? If your software supports it then you could get away with an i3 and save even more power.

Have you considered the economics of this?

At 180 watts you will use 1576.8 kWh of electricity per year. Assuming you pay 12.5p per kWh, this is £197.10 per year. What job are you in which means that £200 is of so little interest to you that you can burn money like this? If you have children, are they clothed and fed, or does this vanity project come before that?

Then there is the environmental impact. Let's assume 40% of this electricity is from natural gas, 33% from coal, and the remainder is not significantly CO2-producing at the point of generation. 296kg of CO2 will be produced from natural gas and 521kg from coal. Together that's about the same as you might expect a car to emit in a year, and you can't exactly get to the seaside in your streaming server...

If everyone in the UK wasted this much electricity, it would equate to 99TWh, around 28% of our national production. We could almost close down every coal-fired power station if this much electricity were saved.
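The running-cost and emissions arithmetic above can be sketched in a few lines. The tariff (12.5p/kWh) and fuel mix come from the post; the per-kWh emission factors (roughly 0.47kg CO2/kWh for gas, about 1.0 for coal) are assumed round figures consistent with the quoted totals, not numbers from the post:

```python
# Rough cost/CO2 arithmetic for a machine drawing a constant load 24/7.
# Emission factors below are assumptions, not authoritative figures.

def annual_kwh(watts: float) -> float:
    """Continuous draw in watts -> kWh consumed per year."""
    return watts / 1000 * 24 * 365

def annual_cost(watts: float, price_per_kwh: float = 0.125) -> float:
    """Yearly electricity cost in pounds at the given unit price."""
    return annual_kwh(watts) * price_per_kwh

kwh = annual_kwh(180)          # 1576.8 kWh/yr
cost = annual_cost(180)        # £197.10/yr
co2_gas = kwh * 0.40 * 0.47    # ~296 kg CO2 from the gas share
co2_coal = kwh * 0.33 * 1.0    # ~520 kg CO2 from the coal share
print(f"{kwh:.1f} kWh/yr, £{cost:.2f}/yr, ~{co2_gas + co2_coal:.0f} kg CO2/yr")
```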

If you are streaming this who on earth watches it? Why don't they just get the game themselves? If it's a multiplayer game then can't they join as a spectator to watch instead?

Sadly, QuickSync doesn't work with MKV encoding in any of the software used :/

Thanks for the advice. I'm going to get a 35W quad-core i5. I think it's possible that the embedded GPU can drive the whole thing, so I could potentially be sub-70W, which should pay for itself in approximately 12 months!

Currently the 5670 graphics card has been underclocked to the lowest settings possible in Catalyst, and it still drives the game at 125fps.
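The 12-month payback estimate above can be sanity-checked with the same tariff used earlier in the thread (12.5p/kWh) and the 180W-to-70W figures from the posts. The £120 upgrade cost below is a placeholder assumption, not a price anyone quoted:

```python
# Rough payback sketch for the planned CPU/motherboard swap.
# 180W and 70W come from the thread; the £120 hardware cost is assumed.

def annual_cost(watts: float, price_per_kwh: float = 0.125) -> float:
    """Yearly cost in pounds of a continuous draw at the given unit price."""
    return watts / 1000 * 24 * 365 * price_per_kwh

saving = annual_cost(180) - annual_cost(70)   # ~£120.45 saved per year
payback_months = 120 / saving * 12            # with an assumed £120 outlay
print(f"saves £{saving:.2f}/yr, pays back in ~{payback_months:.0f} months")
```

With those assumptions the payback does land right around a year, which matches the estimate in the post; a dearer upgrade stretches it proportionally.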

Be careful: there is no 35W quad-core desktop i5. If you are looking at the 3470T (or the older 2390T), that is a dual-core part, basically an i3 with turbo.

The i5-3570T is the lowest-power quad-core desktop i5, at 45W TDP. However, Intel's TDP ratings are pretty conservative; even the normal 77W i5s are very unlikely to pull that much under normal workloads unless overclocked. Even a normal i5 is still going to drop your power consumption a great deal.

Hmmm, Wikipedia says the i5-3470T is a quad-core processor. I do see that there are four different i5 processors with 3470 in the name; the 'Sandy Bridge' one seems to have 2 cores? How confusing!

I'm looking at the 3570T. It says 4 cores but only 2.3GHz? Is that really comparable in live MKV encoding to the 1090T, which has 6 cores at 3.2GHz? It would be incredible if the performance were similar, and I would DEFINITELY buy this CPU!

Probably worth your time to visit Intel's website instead of Wikipedia. For heavily threaded apps, figure the i5-3470T is slower than the i3-3220 (no turbo benefit when you run all threads)... and the i3 encodes slower than the 1090T.

to do your encoding. I know it uses a LOT less power (10W) than any CPU or GPU to encode with. Then use your CPU to transcode the "container"? (I know most of the media servers have this built in, and it takes a LOT less CPU power.)

Not everyone is poor. £200 a year is 54p a day. Big whoop.

It's not your job to dictate what people do with their computer, or how they use power. If you don't like it, don't comment.
