Welcome to MilkyWay@home

GPU Requirements


Message boards : Number crunching : GPU Requirements
Sun Badger*
Joined: 16 Nov 09
Posts: 7
Credit: 4,901,656
RAC: 0
3 million credit badge · 12 year member badge
Message 56648 - Posted: 27 Dec 2012, 18:21:06 UTC

It is strange; I did not understand it, but there they are in my validations. Yesterday I did not see them on the BOINC Manager screen; I am looking for them now, still no sign. These are the (0.25 CPU) tasks that show GPU crunching on the main screen. When I looked at the MW web page under my account, I did see that there were GPU tasks running. Thinking this is crazy, I OC'ed my machine and went to bed. When I got up I rushed to the rat-hole computer room. Looking at my account status on the MW site, I started going through work that had been validated, and there they were: GPU tasks issued and validated, most of them issued and crunched in one day. It is kind of like a tail sticking out of a closed door; it could be a great puppy or an angry boar. I am just not going to pull on the tail, because with my luck it would be a bunch of boars and sows in heat, and that would make for a bad day. So I will not inquire about the door and the tail, or even acknowledge their existence. Thanks for letting me ramble. This stuff will make you crazier than a sh... house rat!
robertmiles
Joined: 30 Sep 09
Posts: 211
Credit: 34,870,096
RAC: 5,933
30 million credit badge · 12 year member badge · extraordinary contributions badge
Message 56680 - Posted: 31 Dec 2012, 23:06:55 UTC - in response to Message 56648.  

The GTX 6nn and GT 6nn cards have some double precision capability, but not as much as similar cards in the GTX 5nn or GT 5nn series. I haven't seen anything yet on whether the MilkyWay@Home workunits are capable of recognizing this difference and using only the double precision capability that is actually there. If not, they may insist on running on the CPU only.
(retired account)
Joined: 22 Oct 11
Posts: 9
Credit: 21,438,521
RAC: 0
20 million credit badge · 10 year member badge
Message 56714 - Posted: 4 Jan 2013, 8:11:47 UTC - in response to Message 56089.  

Michel,

MW requires Double Precision arithmetic. The HD 67xx series doesn't support DP.

Joe


I can confirm this.

The 6xxx series cards that do support DP are limited to the 6970 and 6950, and the same goes for the 7xxx series GPUs.


Not quite correct: the 7750, 7770, 7850 & 7870 all support DP, just at 1/16th of the single-precision rate.


The mobile HD 7970M aka 'Wimbledon XT' also supports DP and works with MW. Technically it is similar to a desktop HD 7870 aka 'Pitcairn', but the GPU clock is only 850 MHz instead of 1000 MHz. Here's a current valid result:

http://milkyway.cs.rpi.edu/milkyway/result.php?resultid=374577906

and from stderr:

Using device 0 on platform 0
Found 1 CL device
Device 'Pitcairn' (Advanced Micro Devices, Inc.:0x1002) (CL_DEVICE_TYPE_GPU)
Driver version: 1084.4 (VM)
Version: OpenCL 1.2 AMD-APP (1084.4)
Compute capability: 0.0
Max compute units: 20
Clock frequency: 850 Mhz
Global mem size: 2147483648
Local mem size: 32768
Max const buf size: 65536
Double extension: cl_khr_fp64

(...)

Estimated AMD GPU GFLOP/s: 170 SP GFLOP/s, 34 DP FLOP/s
Warning: Bizarrely low flops (34). Defaulting to 100
Patrac999
Joined: 15 Dec 10
Posts: 1
Credit: 695,037
RAC: 0
500 thousand credit badge · 11 year member badge
Message 56884 - Posted: 14 Jan 2013, 18:12:35 UTC

So it's normal that MilkyWay uses only 5-10% of my ATI 7770?
M0CZY
Joined: 26 Jun 09
Posts: 14
Credit: 254,714
RAC: 11
100 thousand credit badge · 12 year member badge
Message 57103 - Posted: 31 Jan 2013, 17:13:06 UTC

Before I go out and spend the money, I would like to know whether an Nvidia GeForce GT 610 would be a suitable GPU for this project.
According to this page, it is CUDA-Enabled, with a Compute Capability of 2.1
https://developer.nvidia.com/cuda-gpus
My computer only has a 305W PSU, so I can't really put in anything much more potent than this.
If it is unsuitable, maybe it could be used for other projects?
The biggest threat to public safety and security is not terrorism, it is Government abuse of authority.
Bitcoin Donations: 1Le52kWoLz42fjfappoBmyg73oyvejKBR3
robertmiles
Joined: 30 Sep 09
Posts: 211
Credit: 34,870,096
RAC: 5,933
30 million credit badge · 12 year member badge · extraordinary contributions badge
Message 57112 - Posted: 31 Jan 2013, 22:35:38 UTC - in response to Message 57103.  
Last modified: 31 Jan 2013, 22:46:04 UTC

The GT 6xx series doesn't do double precision as fast as the GT 5xx series, and this project uses double precision heavily. You might want to check whether there's anything in the GT 5xx series that fits the power supply limit.

It will probably be able to run the applications anyway - but the question is whether it will run them fast enough.

Also, don't confuse the GT 610 with the GT 610M; one won't fit in a socket for the other.
M0CZY
Joined: 26 Jun 09
Posts: 14
Credit: 254,714
RAC: 11
100 thousand credit badge · 12 year member badge
Message 57199 - Posted: 8 Feb 2013, 15:26:17 UTC

It will probably be able to run the applications anyway - but the question is whether it will run them fast enough.

My new GT 610 is turning in the current work units in about 70 minutes each.
Compared to my slow CPU, this seems very quick to me!
Richard Haselgrove
Joined: 4 Sep 12
Posts: 219
Credit: 449,588
RAC: 0
100 thousand credit badge · 9 year member badge
Message 57200 - Posted: 8 Feb 2013, 17:10:00 UTC - in response to Message 57112.  

The GT 6xx series doesn't do double precision as fast as the GT 5xx series, and this project uses double precision heavily. You might want to check whether there's anything in the GT 5xx series that fits the power supply limit.

It will probably be able to run the applications anyway - but the question is whether it will run them fast enough.

Also, don't confuse the GT 610 with the GT 610M; one won't fit in a socket for the other.

The GT 610 card is based on the GF119 (Fermi-technology) chip, so it has more in common with the 5xx ranges than with its bigger GK (Kepler) siblings in the GTX 6xx ranges.

Comparison of Nvidia graphics processing units
Greg
Joined: 13 Dec 08
Posts: 3
Credit: 23,340,710
RAC: 0
20 million credit badge · 13 year member badge
Message 57204 - Posted: 9 Feb 2013, 8:50:35 UTC

Hi,
I use an HD 6990 and an HD 7950, but I am not able to use them both on the same motherboard under Win7 or Win8. When I have them in one machine, the work units are done at the same speed as with only the HD 7950 in on its own. I have removed all the AMD drivers and reinstalled them, but the result is the same, with or without CrossFire. Is there a trick to using different GPU cards on one machine?
Thanks for your reply.
John G
Joined: 1 Apr 10
Posts: 49
Credit: 171,863,025
RAC: 0
100 million credit badge · 12 year member badge
Message 57205 - Posted: 9 Feb 2013, 8:55:02 UTC

Greg
Try plugging both cards into separate monitors; this should cure the problem.

Regards
mikey
Joined: 8 May 09
Posts: 2821
Credit: 463,358,641
RAC: 10,792
300 million credit badge · 13 year member badge · extraordinary contributions badge
Message 57207 - Posted: 9 Feb 2013, 12:36:30 UTC - in response to Message 57205.  

Greg
Try plugging both cards into separate monitors; this should cure the problem.

Regards


Sometimes you need a text file called cc_config.xml that tells BOINC to use all the GPUs it sees. Here is an example of one:

<cc_config>
<options>
<use_all_gpus>1</use_all_gpus>
<skip_cpu_benchmarks>1</skip_cpu_benchmarks>
</options>
</cc_config>

Use Notepad: just copy and paste the file into the BOINC directory, saving it as a txt-type file. Stop and restart BOINC and it should download and run units on each card separately. One thing though: you will LOSE EVERY UNIT you currently have if you do this while there are units in the cache, so it is better to run your cache down before doing it. Then, as John G said, you MUST either make a 'dummy plug' or plug your 2nd card into a monitor at Windows startup for it to use each card separately. A 'dummy plug' how-to can be found here:
http://www.overclock.net/t/384733/the-30-second-dummy-plug

It REALLY is as easy as they show and it works just fine. I buy my resistors at Radio Shack and they are a couple of bucks for a pack of 5 of them.
Greg
Joined: 13 Dec 08
Posts: 3
Credit: 23,340,710
RAC: 0
20 million credit badge · 13 year member badge
Message 57280 - Posted: 18 Feb 2013, 21:57:00 UTC

If BOINC is only using one graphics card when it should be using more than one,
it may be necessary to follow a few steps:
1. Connect at least one monitor to each graphics card.
2. Place cc_config.xml into the correct folder.
3. Restart BOINC.

Thank you for the help from this message board;
I was able to solve the problem with the following steps.
Not all operating systems let you change a file's extension from .txt to .xml,
so copy any .xml file from the /ProgramData/BOINC folder onto the desktop,
change the name of the file on the desktop to cc_config.xml,
and use Notepad to change the contents of that cc_config.xml to:

<cc_config>
<options>
<use_all_gpus>1</use_all_gpus>
<skip_cpu_benchmarks>1</skip_cpu_benchmarks>
</options>
</cc_config>

Save the file into the correct folder (/ProgramData/BOINC), then delete the file that is on the desktop.
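Editor's note: before restarting BOINC, it can be worth confirming that the file is well-formed XML, since a stray character can make the client silently ignore it. A minimal sketch in Python (any XML-aware tool would do; the contents below mirror the example above, and the file path on your machine may differ):

```python
# Parse a cc_config.xml and confirm the options BOINC will read from it.
# Minimal sketch: here the file contents are inlined; in practice you would
# read them from /ProgramData/BOINC/cc_config.xml (Windows default path).
import xml.etree.ElementTree as ET

CC_CONFIG = """<cc_config>
<options>
<use_all_gpus>1</use_all_gpus>
<skip_cpu_benchmarks>1</skip_cpu_benchmarks>
</options>
</cc_config>"""

root = ET.fromstring(CC_CONFIG)  # raises ParseError if the XML is malformed
use_all = root.findtext("options/use_all_gpus")
print("well-formed, use_all_gpus =", use_all)
```

If this prints without a ParseError, the file is at least structurally valid and BOINC should pick up the options on restart.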
skgiven
Joined: 22 Dec 07
Posts: 35
Credit: 18,433,204
RAC: 0
10 million credit badge · 14 year member badge
Message 57287 - Posted: 19 Feb 2013, 18:48:36 UTC - in response to Message 57280.  

The GTX Titan should do well here; its FP64 performance is 1/3 that of its FP32.

'NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute'
by Ryan Smith, anandtech.com
HA-SOFT, s.r.o.
Joined: 26 Feb 13
Posts: 2
Credit: 1,400,786
RAC: 0
1 million credit badge · 9 year member badge
Message 57381 - Posted: 28 Feb 2013, 15:05:37 UTC - in response to Message 57287.  

The GTX Titan should do well here; its FP64 performance is 1/3 that of its FP32.

'NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute'
by Ryan Smith, anandtech.com


I tested the Titan on MilkyWay yesterday and it works OK. When double precision is enabled it can run 7 tasks in parallel (~97% GPU, 80% of TDP = 200 W) and can produce circa 300k credits per day.
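Editor's note: running several tasks per GPU like this is normally configured in BOINC with an app_config.xml in the project's directory. A sketch for 7 concurrent tasks (1/7 ≈ 0.14); the <name> value and the CPU share here are assumptions — check the actual application name in the project's entries in client_state.xml:

```
<app_config>
  <app>
    <name>milkyway</name>
    <gpu_versions>
      <gpu_usage>0.14</gpu_usage>
      <cpu_usage>0.05</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
```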
Jeroen
Joined: 11 Dec 08
Posts: 8
Credit: 14,977,931
RAC: 0
10 million credit badge · 13 year member badge
Message 57391 - Posted: 1 Mar 2013, 3:21:46 UTC - in response to Message 57381.  
Last modified: 1 Mar 2013, 3:23:32 UTC


I tested the Titan on MilkyWay yesterday and it works OK. When double precision is enabled it can run 7 tasks in parallel (~97% GPU, 80% of TDP = 200 W) and can produce circa 300k credits per day.


Thanks for sharing your results. Could you try running one task at a time with the 1/3 DP option enabled and post the runtime per task? I wanted to see how the Titan runs compared to the 7970 with a single task running.

Also, do the overclocking and power target options work with DP mode enabled?
HA-SOFT, s.r.o.
Joined: 26 Feb 13
Posts: 2
Credit: 1,400,786
RAC: 0
1 million credit badge · 9 year member badge
Message 57392 - Posted: 1 Mar 2013, 8:48:46 UTC - in response to Message 57391.  

The time for one task is the same as for 7 tasks in parallel: ~500 s.

MSI Afterburner 3.0.0 beta 5 allows me to overclock the card with DP enabled, but I didn't test it.
Paul and kirsty yates
Joined: 25 Feb 13
Posts: 3
Credit: 4,257,694
RAC: 43
3 million credit badge · 9 year member badge
Message 57428 - Posted: 5 Mar 2013, 1:27:40 UTC

I have a GeForce 6600 GT; will it work on this project?
I seem to be crunching WUs; hope I'm not wasting my time.
robertmiles
Joined: 30 Sep 09
Posts: 211
Credit: 34,870,096
RAC: 5,933
30 million credit badge · 12 year member badge · extraordinary contributions badge
Message 57429 - Posted: 5 Mar 2013, 2:32:14 UTC - in response to Message 57428.  

I have a GeForce 6600 GT; will it work on this project?
I seem to be crunching WUs; hope I'm not wasting my time.


See the first post in this thread. The 6600 GT does not support double precision, and therefore will not work on this project.

A few more astronomy-related projects you might want to try it on:

Einstein@Home
http://einstein.phys.uwm.edu/

SETI@Home
http://setiathome.berkeley.edu/

arkayn
Joined: 14 Feb 09
Posts: 999
Credit: 74,932,619
RAC: 0
50 million credit badge · 13 year member badge
Message 57430 - Posted: 5 Mar 2013, 3:43:20 UTC - in response to Message 57428.  

I have a GeForce 6600 GT; will it work on this project?
I seem to be crunching WUs; hope I'm not wasting my time.


That video card is not capable of crunching at all; the first Nvidia cards with CUDA were the 8xxx series.
Paul and kirsty yates
Joined: 25 Feb 13
Posts: 3
Credit: 4,257,694
RAC: 43
3 million credit badge · 9 year member badge
Message 57433 - Posted: 5 Mar 2013, 19:19:55 UTC

Ohhh?
I have credit for completed WUs,
and 2 awaiting validation.



©2022 Astroinformatics Group