Welcome to MilkyWay@home

GPU app teaser

Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 9229 - Posted: 26 Jan 2009, 21:40:11 UTC
Last modified: 26 Jan 2009, 22:09:44 UTC

As I already mentioned in some other posts, I am working on an implementation of the MW application for ATI graphics hardware (RV670 and up, which means HD38x0, HD4670?? and HD48x0). It is not finished yet (there are still some scheduling problems, as BOINC only supports CUDA at the moment), but it calculates results that are in line with the requirements (at least when running offline; I still have to ask Travis about those).

But I guess you want to know how fast it is! So without further ado, here is a screenshot for you (taken on a Phenom II 940BE with a Radeon HD4870):



That's something I would call fast!
UBT - Ben

Joined: 8 Mar 08
Posts: 17
Credit: 4,411,459
RAC: 0
Message 9232 - Posted: 26 Jan 2009, 22:12:50 UTC - in response to Message 9229.  

Holy mother of moo moos... 8 seconds per WU!!

Great to see that some people can develop such an application on a GPU (not the easiest of things!). Just a shame that the BOINC devs have decided to cast ATI to one side and adopt only CUDA, is all I can say.

Well done, keep it up :)
Temujin

Joined: 12 Oct 07
Posts: 77
Credit: 404,471,187
RAC: 0
Message 9233 - Posted: 26 Jan 2009, 22:35:23 UTC - in response to Message 9229.  

I agree, excellent work!!

I know Nvidia cards aren't as fast as ATI for this kind of work, but once you have completed the ATI app, would you consider a CUDA app?
GalaxyIce

Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 9235 - Posted: 26 Jan 2009, 22:44:38 UTC

I've just had a go at knocking out my first GPU WU which recently completed giving me nearly 4,000 credits.

I was very impressed, but much more so with the fact that all the while my laptop was churning out 2 MW WUs every 8 minutes or so, at the same time.

I knew VM would eventually let me squeeze out more.

But this is all nothing compared to how I now have an optimized signature with 6 images and 5 links. Who can beat that ? :P



Misfit
Joined: 27 Aug 07
Posts: 915
Credit: 1,503,319
RAC: 0
Message 9244 - Posted: 27 Jan 2009, 1:53:28 UTC - in response to Message 9235.  

I've just had a go at knocking out my first GPU WU which recently completed giving me nearly 4,000 credits.

I was very impressed, but much more so with the fact that all the while my laptop was churning out 2 MW WUs every 8 minutes or so, at the same time.

I knew VM would eventually let me squeeze out more.

But this is all nothing compared to how I now have an optimized signature with 6 images and 5 links. Who can beat that ? :P

Travis, please transfer those 4,000 credits to my account. That beats all.
me@rescam.org
Paul D. Buck

Joined: 12 Apr 08
Posts: 621
Credit: 161,934,067
RAC: 0
Message 9275 - Posted: 27 Jan 2009, 21:40:42 UTC

Now there is the making of a conundrum ... which card to get ... ATI so you can do MilkyWay, or Nvidia so you can do SaH and/or GPU Grid?

Or different cards for different systems ... :)

Just a side note, it is possible that GPU Grid got the CPU load issue under control, at least at first blush ... I have one task in flight right now with 1% tops CPU load ... over 10% done in two hours and just over a minute of CPU time ...
jedirock
Joined: 8 Nov 08
Posts: 178
Credit: 6,140,854
RAC: 0
Message 9276 - Posted: 27 Jan 2009, 21:41:23 UTC - in response to Message 9229.  

w00t, sign me up! Just been waiting to get my 4870 1GB rolling. I was waiting on AI because they're working on an official app last I heard, but Milkyway will work too! :-D
DoctorNow

Joined: 28 Aug 07
Posts: 146
Credit: 10,703,601
RAC: 0
Message 9290 - Posted: 28 Jan 2009, 5:50:07 UTC - in response to Message 9229.  
Last modified: 28 Jan 2009, 5:51:13 UTC

That's really crazy. Now I wish I had an ATI card...
But: does this break the credit limit barrier of 108? I wonder.
Otherwise it's only another method to crunch more WUs without getting more credits from it.
Member of BOINC@Heidelberg and ATA!

My BOINCstats
GalaxyIce

Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 9291 - Posted: 28 Jan 2009, 7:20:09 UTC - in response to Message 9290.  

Otherwise it's only another method to crunch more WUs without getting more credits from it.

You mean, like work without pay?



Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 9294 - Posted: 28 Jan 2009, 10:53:41 UTC - in response to Message 9291.  

Otherwise it's only another method to crunch more WUs without getting more credits from it.

You mean, like work without pay?


Not without pay, but close to it!
With the current limits (especially the 300 WU per-core limit), you would get only about 320 credits/day on that system with the GPU app *lol*

If this is really going to work, Travis will have to do something about it.
GalaxyIce

Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 9296 - Posted: 28 Jan 2009, 11:53:37 UTC - in response to Message 9294.  
Last modified: 28 Jan 2009, 11:54:00 UTC

Not without pay, but close to it!


Gulp! I can't imagine anyone providing computing resources to a BOINC project like this and getting almost negligible credit in return.

Phil

Joined: 13 Feb 08
Posts: 1124
Credit: 46,740
RAC: 0
Message 9299 - Posted: 28 Jan 2009, 13:37:04 UTC - in response to Message 9296.  
Last modified: 28 Jan 2009, 13:56:13 UTC

Not without pay, but close to it!


Gulp! I can't imagine anyone providing computing resources to a BOINC project like this and getting almost negligible credit in return.


Just think of it as a small fan-heater that supplies 8k+ MW units/day when it's not needed as a graphics card.
Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 9310 - Posted: 28 Jan 2009, 20:36:14 UTC
Last modified: 28 Jan 2009, 20:47:05 UTC

In case anyone thought the screenshot might be a fake, there is now a video of the running app.

I've implemented a solution to the scheduling problem (at least for Windows) that seems to be working on that specific machine. It can calculate several WUs at once, but that doesn't raise the throughput of the app, so calculating 4 WUs at the same time on a quad-core isn't faster than doing them one after another. It uses one CPU core regardless of the number of concurrent WUs.
In principle it works like a small scheduler that switches between the WUs 20 times a second (on a HD4870; 6 times a second on a HD3850). The video shows that GPU-Z is somehow incompatible with this approach, as one experiences severe lag or short freezes of the system after it is started.
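For anyone curious what such time-slicing might look like, here is a minimal sketch (hypothetical names and placeholder work, not the actual MW code): one host thread owns the GPU and round-robins between the in-flight WUs, giving each a short slice per tick.

```cpp
// Minimal round-robin time-slice scheduler sketch (hypothetical, not the MW code).
#include <chrono>
#include <thread>
#include <vector>

struct Workunit {
    int slices_left = 80;                 // placeholder for the real integration state
    bool finished() const { return slices_left <= 0; }
    void run_slice() { --slices_left; }   // real app: launch ~one slice of GPU kernels here
};

void gpu_scheduler(std::vector<Workunit>& wus, int slices_per_second) {
    const auto slice = std::chrono::milliseconds(1000 / slices_per_second);
    bool work_left = true;
    while (work_left) {
        work_left = false;
        for (auto& wu : wus) {            // switch between the WUs in round-robin order
            if (wu.finished()) continue;
            const auto start = std::chrono::steady_clock::now();
            wu.run_slice();               // one time slice of GPU work for this WU
            std::this_thread::sleep_until(start + slice);
            work_left = true;
        }
    }
}

int main() {
    std::vector<Workunit> wus(4);         // e.g. 4 WUs sharing one card
    gpu_scheduler(wus, 20);               // ~20 switches/s on a HD4870, ~6 on a HD3850
}
```

Several WUs share the device this way, but the total throughput stays the same as with a single WU.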

Thanks Emploi for that video!
UBT - Ben

Joined: 8 Mar 08
Posts: 17
Credit: 4,411,459
RAC: 0
Message 9311 - Posted: 28 Jan 2009, 21:19:46 UTC - in response to Message 9310.  

In case anyone thought the screenshot might be a fake, there is now a video of the running app.

I've implemented a solution to the scheduling problem (at least for Windows) that seems to be working on that specific machine. It can calculate several WUs at once, but that doesn't raise the throughput of the app, so calculating 4 WUs at the same time on a quad-core isn't faster than doing them one after another. It uses one CPU core regardless of the number of concurrent WUs.
In principle it works like a small scheduler that switches between the WUs 20 times a second (on a HD4870; 6 times a second on a HD3850). The video shows that GPU-Z is somehow incompatible with this approach, as one experiences severe lag or short freezes of the system after it is started.

Thanks Emploi for that video!



Jeeeeeezzzz... And I thought you meant 1 result every 10 or so seconds! But 4!! Wohay, why the hell did I get Nvidia! :P

Great app!
Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 9319 - Posted: 28 Jan 2009, 22:38:17 UTC - in response to Message 9311.  

Jeeeeeezzzz... And I thought you meant 1 result every 10 or so seconds! But 4!! Wohay, why the hell did I get Nvidia! :P

Great app!

It's 4 WUs every 35 to 40 seconds or so, which works out to roughly one WU every 9 seconds.

If you lower the resource share (i.e. crunch some other projects in parallel), the app will calculate fewer MW WUs at a time, but the absolute throughput will stay the same (1 WU per 9 seconds).
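A back-of-the-envelope check of those numbers (assumed values only, taken from the figures above): the card finishes a roughly fixed amount of work per second, so each WU takes longer when more of them share the GPU, while the total output stays constant.

```cpp
// Toy throughput check: total GPU output is fixed, only per-WU latency changes.
#include <cstdio>

int main() {
    const double wu_per_second = 1.0 / 9.0;   // total GPU throughput from the post above
    for (int parallel = 1; parallel <= 4; ++parallel) {
        double seconds_per_wu = parallel / wu_per_second;   // each WU takes longer...
        double finished_per_minute = 60.0 * wu_per_second;  // ...but the output is constant
        std::printf("%d WUs in flight: %.0f s each, %.1f WUs/min total\n",
                    parallel, seconds_per_wu, finished_per_minute);
    }
}
```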
zombie67 [MM]
Joined: 29 Aug 07
Posts: 115
Credit: 502,527,662
RAC: 0
Message 9338 - Posted: 29 Jan 2009, 8:08:04 UTC - in response to Message 9276.  

w00t, sign me up! Just been waiting to get my 4870 1GB rolling. I was waiting on AI because they're working on an official app last I heard, but Milkyway will work too! :-D


I have an ATI Radeon X1900 in my Mac. Two questions.

1) "...that means HD38x0, HD4670?? and HD48x0" What the heck does that mean? Which cards/chips are we talking about?

2) Will this app work on Macs?

Cluster Physik

Joined: 26 Jul 08
Posts: 627
Credit: 94,940,203
RAC: 0
Message 9340 - Posted: 29 Jan 2009, 8:54:22 UTC - in response to Message 9338.  

Two questions.

1) "...that means HD38x0, HD4670?? and HD48x0" What the heck does that mean? Which cards/chips are we talking about?

2) Will this app work on Macs?

1. Consider the x a wildcard. It runs on RV670- and RV770-class GPUs (and anything newer that supports double precision, so it's forward compatible). That means the HD3850, HD3870, HD4850 and HD4870 (most probably the HD4830 too). As for the RV730 (HD4670 and HD4650), there is some contradictory information: testing shows it does not run with the current driver, but this may or may not change with a newer driver revision.
2. Macs are not supported (you can't even download a driver for the newer cards from AMD, so I suppose they wouldn't run). But there is potential support for Linux (not implemented yet).
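To summarize the compatibility picture in one place (an illustrative lookup only, names hypothetical; double precision is what the app needs):

```cpp
// Card -> chip / double-precision summary, as described in the posts above.
#include <iostream>
#include <map>
#include <string>

struct GpuSupport {
    std::string chip;
    bool double_precision;   // required by the MW GPU app
};

const std::map<std::string, GpuSupport> kAtiCards = {
    {"HD3850", {"RV670", true}},
    {"HD3870", {"RV670", true}},
    {"HD4830", {"RV770", true}},   // "most probably" works, per the post above
    {"HD4850", {"RV770", true}},
    {"HD4870", {"RV770", true}},
    {"HD4650", {"RV730", false}},  // refused to run with the current driver
    {"HD4670", {"RV730", false}},
    {"X1900",  {"R580",  false}},  // pre-RV670, no double precision
};

int main() {
    const auto& card = kAtiCards.at("HD4870");
    std::cout << "HD4870 -> " << card.chip
              << (card.double_precision ? " (supported)\n" : " (not supported)\n");
}
```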
sandro

Joined: 17 Oct 08
Posts: 16
Credit: 16,783
RAC: 0
Message 9341 - Posted: 29 Jan 2009, 8:54:38 UTC - in response to Message 9338.  


I have an ATI Radeon X1900 in my Mac. Two questions.

1) "...that means HD38x0, HD4670?? and HD48x0" What the heck does that mean? Which cards/chips are we talking about?

2) Will this app work on Macs?


Your X1900 will not work. You need at least a card with the RV670 chip (3850, 3870) or the RV770 chip (4830, 4850, 4870).
In principle all ATI cards from the RV670 onwards should work, but sometimes double precision is locked by either the BIOS or the driver (the 4670 refused to work).
L@MiR

Joined: 21 Jan 09
Posts: 5
Credit: 14,571,494
RAC: 0
Message 9372 - Posted: 30 Jan 2009, 0:03:39 UTC

8 tasks on 1 GPU ;)

http://de.youtube.com/watch?v=nnsW-zB95Is (new video)
jedirock
Joined: 8 Nov 08
Posts: 178
Credit: 6,140,854
RAC: 0
Message 9380 - Posted: 30 Jan 2009, 1:04:33 UTC - in response to Message 9372.  

Is there any way to get the application now, or is it just open to Planet 3DNow members?