
Power consumption by GPUs

Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18071 - Posted: 9 Apr 2009, 20:58:44 UTC

Assuming that I put an AGP-based HD3850 in my AMD system, what would be the difference in power draw compared to the CPU if the GPU is left running all the time and is being fed work to do all the time? IOW, is there any power savings to be had by doing that? I can't see that there would be, but I don't have one of those fancy Kill-a-Watt things... Bear in mind my CPU is overclocked and running at a relatively high voltage (1.65ish, IIRC)

Thanks...

Brian
ID: 18071
Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 18073 - Posted: 9 Apr 2009, 21:06:08 UTC - in response to Message 18071.  

Assuming that I put an AGP-based HD3850 in my AMD system, what would be the difference in power draw compared to the CPU if the GPU is left running all the time and is being fed work to do all the time? IOW, is there any power savings to be had by doing that? I can't see that there would be, but I don't have one of those fancy Kill-a-Watt things... Bear in mind my CPU is overclocked and running at a relatively high voltage (1.65ish, IIRC)

Thanks...

Brian

An HD3850 will draw about 60 to 70 watts under load. It would be really hard to get the same flops from a CPU at the same wattage, considering you would need a 4GHz Core i7 or so to get even somewhat close.
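
For a rough sense of the efficiency gap, here is a back-of-the-envelope comparison (the peak-flops and CPU-wattage numbers are rough illustrative assumptions, not measurements):

    # Rough performance-per-watt comparison; all numbers are illustrative guesses.
    gpu_gflops = 430.0   # assumed single-precision peak of an HD3850
    gpu_watts = 65.0     # midpoint of the 60-70 watt load estimate above
    cpu_gflops = 50.0    # assumed peak of a ~4GHz quad-core CPU
    cpu_watts = 130.0    # assumed load draw of such a CPU

    print(gpu_gflops / gpu_watts)  # ~6.6 GFLOPS per watt
    print(cpu_gflops / cpu_watts)  # ~0.4 GFLOPS per watt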
ID: 18073
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18079 - Posted: 9 Apr 2009, 21:55:33 UTC - in response to Message 18073.  

Assuming that I put an AGP-based HD3850 in my AMD system, what would be the difference in power draw compared to the CPU if the GPU is left running all the time and is being fed work to do all the time? IOW, is there any power savings to be had by doing that? I can't see that there would be, but I don't have one of those fancy Kill-a-Watt things... Bear in mind my CPU is overclocked and running at a relatively high voltage (1.65ish, IIRC)

Thanks...

Brian

An HD3850 will draw about 60 to 70 watts under load. It would be really hard to get the same flops from a CPU at the same wattage, considering you would need a 4GHz Core i7 or so to get even somewhat close.


So the question is: how much wattage is my processor drawing while under load? I'm not asking this from a flops angle or a make-more-points angle, but trying to figure out if spending the money on the 3850 would have an ROI in terms of savings on the electric bill...
ID: 18079
Biffa

Joined: 11 Mar 09
Posts: 12
Credit: 151,458,400
RAC: 0
Message 18082 - Posted: 9 Apr 2009, 22:19:31 UTC

What CPU do you have?
ID: 18082
Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 18085 - Posted: 9 Apr 2009, 22:41:45 UTC - in response to Message 18079.  

So the question is: how much wattage is my processor drawing while under load? I'm not asking this from a flops angle or a make-more-points angle, but trying to figure out if spending the money on the 3850 would have an ROI in terms of savings on the electric bill...

If you shut down one box and put a GPU in another one, you will definitely see some savings.

Or are you thinking about running the box only with MW-GPU and letting the CPU idle? Then it depends largely on the efficiency of the power-saving features of the CPU you are using.
I think the former solution would make more sense, as otherwise you can't get around the idle consumption of the box.
ID: 18085
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18087 - Posted: 9 Apr 2009, 22:59:47 UTC - in response to Message 18082.  

What CPU do you have?


It's an AMD Athlon64 3700+, but overclocked. The nominal clock rate is 2200MHz, but I've got it clocked to 2750MHz... Nominal voltage I think was 1.4-1.45V, but I'm running around 1.63-1.66V to keep the overclock stable. Core temps usually run 58-62°C...
ID: 18087
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18088 - Posted: 9 Apr 2009, 23:06:00 UTC - in response to Message 18085.  

So the question is: how much wattage is my processor drawing while under load? I'm not asking this from a flops angle or a make-more-points angle, but trying to figure out if spending the money on the 3850 would have an ROI in terms of savings on the electric bill...

If you shut down one box and put a GPU in another one, you will definitely see some savings.

Or are you thinking about running the box only with MW-GPU and letting the CPU idle? Then it depends largely on the efficiency of the power-saving features of the CPU you are using.
I think the former solution would make more sense, as otherwise you can't get around the idle consumption of the box.


The former solution would work if I had a spare... The idea is whether there would be enough power savings from running the GPU instead of the CPU while participating here. Otherwise the system would be idle, and as you point out, there'd still be the idle consumption. As I see it, I'd need to know the consumption at load and then at idle, take that delta, perhaps knock a little off the delta because the system would still draw some additional power to handle any AGP-to-main-memory transfers, and then project that out across some span of time. My guess is that it wouldn't be much savings, but I don't know... I'll probably have to get one of those meters unless there's a known accurate calculator site out there somewhere...
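
Something like this back-of-the-envelope estimate is what I have in mind (every input below is just a guess that a real watt meter would replace):

    # Hypothetical monthly savings estimate; every input is a placeholder
    # to be replaced by real watt-meter readings.
    load_watts = 150.0   # system draw while crunching on the CPU (guess)
    idle_watts = 90.0    # system draw with the CPU idle (guess)
    gpu_watts = 70.0     # added draw of the HD3850 under load (guess)
    overhead = 10.0      # extra draw for AGP/main-memory transfers (guess)
    price_kwh = 0.10     # electricity price in $/kWh (guess)
    hours = 24 * 30      # one month of around-the-clock crunching

    cpu_cost = load_watts / 1000 * hours * price_kwh
    gpu_cost = (idle_watts + gpu_watts + overhead) / 1000 * hours * price_kwh
    # A negative result means the GPU setup would actually draw more power.
    print(cpu_cost - gpu_cost)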
ID: 18088
banditwolf

Joined: 12 Nov 07
Posts: 2425
Credit: 524,164
RAC: 0
Message 18092 - Posted: 9 Apr 2009, 23:15:19 UTC - in response to Message 18088.  

I'll probably have to get one of those meters unless there's a known accurate calculator site out there somewhere...


Not sure how accurate this is for your needs: Here
Doesn't expecting the unexpected make the unexpected the expected?
If it makes sense, DON'T do it.
ID: 18092
Jord

Joined: 30 Aug 07
Posts: 125
Credit: 207,206
RAC: 0
Message 18093 - Posted: 9 Apr 2009, 23:16:44 UTC - in response to Message 18088.  

I don't know how much power my Sapphire HD3850 (AGP) draws, but it does come with a dual molex to 6 pin power connector which said enough to me. Although I first ran it on one power connector only (all through playing Far Cry 2) and that didn't make my system crash. ;-)

The thing you really have to watch out for is the size. These cards are long: 24 centimeters from end to end. Make sure you have that space, also because you still have to attach the power cable(s).
Jord.

The BOINC FAQ Service.
ID: 18093
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18103 - Posted: 9 Apr 2009, 23:59:42 UTC - in response to Message 18092.  
Last modified: 9 Apr 2009, 23:59:59 UTC

I'll probably have to get one of those meters unless there's a known accurate calculator site out there somewhere...


Not sure how accurate this is for your needs: Here


Yeah, that's saying 156W for the overclocked CPU. I dunno; I doubt it's worth it. At most I may save 60W by idling, so if that's the case and the card draws 70-80W, then I'd be drawing more power...
ID: 18103
STE\/E

Joined: 29 Aug 07
Posts: 486
Credit: 576,521,060
RAC: 35,479
Message 18106 - Posted: 10 Apr 2009, 0:06:47 UTC - in response to Message 18093.  
Last modified: 10 Apr 2009, 0:08:10 UTC

The thing you really have to watch out for is the size. These cards are long: 24 centimeters from end to end. Make sure you have that space, also because you still have to attach the power cable(s).


If I'm a little short of space, or if the cards are like the ATI 4800 series cards where the power cords go in the end instead of the side like the NVIDIA cards, I just plug the power cords in first and then slide the card into place; seems to work for me anyway. I take 'em out the same way and unplug 'em after I have 'em out far enough.
ID: 18106
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18109 - Posted: 10 Apr 2009, 0:31:58 UTC - in response to Message 18106.  

The thing you really have to watch out for is the size. These cards are long: 24 centimeters from end to end. Make sure you have that space, also because you still have to attach the power cable(s).


If I'm a little short of space, or if the cards are like the ATI 4800 series cards where the power cords go in the end instead of the side like the NVIDIA cards, I just plug the power cords in first and then slide the card into place; seems to work for me anyway. I take 'em out the same way and unplug 'em after I have 'em out far enough.


The 6800GT card I have now is fairly long and has the molex connector on the end. I don't think it's worth it, though. It's doubtful that I'd save money, and even if I did, it would probably be 10W... so a long time to get an ROI...
ID: 18109
lllvette

Joined: 17 Jan 08
Posts: 6
Credit: 34,473,193
RAC: 0
Message 18128 - Posted: 10 Apr 2009, 1:28:50 UTC

@Brian: Bite the bullet and pay the electric company... Never look back...
ID: 18128
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18132 - Posted: 10 Apr 2009, 2:26:31 UTC - in response to Message 18128.  

@Brian: Bite the bullet and pay the electric company... Never look back...


Nah... I need to save money; that's the only reason I had considered it. It's also why I sometimes power down at night now, and why I get a chuckle when someone says that they "can't afford" for their systems to be idle. Actually, they can "better afford" it if their systems are idle...
ID: 18132
Paul D. Buck

Joined: 12 Apr 08
Posts: 621
Credit: 161,934,067
RAC: 0
Message 18137 - Posted: 10 Apr 2009, 5:29:04 UTC

I am still trying to figure out the context of this thread...

The cost of running the GPU will be about equal to or higher than that of the CPU as a general rule, depending on the capabilities of the GPU ...

So the only way I can see this question being relevant is watts per Cobblestone, and if that is the question, then the GPU is going to be more efficient ...
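
As a sketch of that metric (the credit rates below are made-up placeholders, purely to illustrate the comparison):

    # Energy-per-credit comparison; the credit rates are made-up placeholders.
    cpu_credits_per_day = 300.0    # assumed daily credit from a CPU
    cpu_watts = 156.0              # load draw from the calculator mentioned above
    gpu_credits_per_day = 6000.0   # assumed daily credit from a mid-range GPU
    gpu_watts = 70.0               # load draw estimated earlier in the thread

    print(cpu_watts * 24 / cpu_credits_per_day)  # ~12.5 Wh per credit (CPU)
    print(gpu_watts * 24 / gpu_credits_per_day)  # ~0.3 Wh per credit (GPU)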

But I thought the point was to do the science ...
ID: 18137
GalaxyIce

Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 18144 - Posted: 10 Apr 2009, 8:43:51 UTC - in response to Message 18137.  

I am still trying to figure out the context of this thread...

The cost of running the GPU will be about equal to or higher than that of the CPU as a general rule, depending on the capabilities of the GPU ...

So the only way I can see this question being relevant is watts per Cobblestone, and if that is the question, then the GPU is going to be more efficient ...

But I thought the point was to do the science ...

Yes, but even science has to be done within a budget, and with an eye on "if I do it this way I could get more science done than by spending my money that way".

I initially decided to spend money on an ATI card so that it would crunch the WUs of the PCs that I could then switch off, saving a heck of a lot on electricity bills. That is, one PC with an ATI card does the work of many PCs without ATI cards. Or, one PC with an ATI card is equivalent to a 'farm' but consumes far less electricity.

For this reason, yes, the science is being done at a lower (electricity bill) cost.

Cluster Physik is saving us money ;)


ID: 18144
Brian Silvers

Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 18157 - Posted: 10 Apr 2009, 12:05:56 UTC - in response to Message 18137.  
Last modified: 10 Apr 2009, 12:08:38 UTC

I am still trying to figure out the context of this thread...

The cost of running the GPU will be about equal to or higher than that of the CPU as a general rule, depending on the capabilities of the GPU ...

So the only way I can see this question being relevant is watts per Cobblestone, and if that is the question, then the GPU is going to be more efficient ...

But I thought the point was to do the science ...


The point is to do the science, but when your credit card company (Citi) has jacked up your interest rate (nearly doubled it) while you are carrying a balance (which I had to, due to the long unemployment), even though you haven't missed a payment and have a good credit score, you start looking into ways to save money. My thought was that if I could decrease the wattage used by my whole system, I could save a bit on the power bill. I just don't think that I'd see a wattage reduction on the whole, or if I did, it would not be enough to warrant a purchase whose ROI would take more than a year... So, I'll just continue to shut my system off from time to time...

Oh, and yes, I am looking into a way to dump Citi...
ID: 18157
ExtraTerrestrial Apes

Joined: 1 Sep 08
Posts: 204
Credit: 219,354,537
RAC: 0
Message 19513 - Posted: 19 Apr 2009, 10:27:36 UTC

Hi Brian,

Saving money the way you're trying to achieve it will be tough. Some points to consider:

- your 6800GT also draws 20 - 25W at idle, so the added power draw due to a 3850 is reduced by this amount
- you can downclock the memory on the 3850 without a performance penalty, which might reduce its power consumption by about 10W, shrinking the delta even further
- not sure about the 3850, but on my 4870 I can control the GPU voltage via software. Undervolting, maybe along with a slight downclock, would improve its power efficiency (while still being much faster than your CPU)
- you'd need Cool'n'Quiet active, otherwise you won't see much power saving by letting the CPU idle... this can be tricky with a highly OC'ed A64
- the 3850 had better be cheap (eBay?), otherwise you're waiting a looong time for an ROI

What I'd probably do: reduce the OC on the CPU to about 2.4 GHz, which is the point at which it should be fine with 1.4 - 1.45V. This alone should reduce your system power draw by an estimated ~30W at a small performance hit. Downclock your 6800GT in 2D (I don't think it does this automatically); that might save another 10W. Power off when you feel like it, switch the monitor off when not in use, and ride this system a little longer until your finances relax.
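
The ~30W figure follows from the usual first-order scaling rule for CPU power, P ~ V^2 * f (a rough approximation; the baseline wattage below is an assumption):

    # First-order CPU power scaling: P ~ V^2 * f (rough approximation).
    p_now = 90.0             # assumed CPU power at 2.75 GHz and 1.65 V
    v_now, f_now = 1.65, 2.75
    v_new, f_new = 1.40, 2.40
    p_new = p_now * (v_new / v_now) ** 2 * (f_new / f_now)
    print(p_now - p_new)     # roughly 30 W saved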

MrS
Scanning for our furry friends since Jan 2002
ID: 19513
