Welcome to MilkyWay@home

Future ATI Card Support?



Message boards : Number crunching : Future ATI Card Support?


Divide Overflow
Joined: 16 Feb 09
Posts: 109
Credit: 11,089,510
RAC: 0
Message 22937 - Posted: 21 May 2009, 20:40:31 UTC

Has the project team departed from their original idea of doing longer, more science-intensive work with the GPU application?
[P3D] Crashtest
Joined: 8 Jan 09
Posts: 58
Credit: 53,074,535
RAC: 48,402
Message 22938 - Posted: 21 May 2009, 20:43:41 UTC - in response to Message 22936.  

1000x bigger? Well, calculated in SP, an ATI Radeon 4830 would take about 85 min, a 4870x2 about 31 min - but nVIDIA GPUs about 2 h or more.
The Gas Giant
Joined: 24 Dec 07
Posts: 1947
Credit: 240,884,648
RAC: 0
Message 22939 - Posted: 21 May 2009, 20:46:05 UTC - in response to Message 22938.  

1000x bigger ? well calc'ed in SP with a ATI Radeon 4830 would take about 85min; on a 4870x2 about 31min - but on nVIDIA GPUs about 2h or more

Perfect then...
[P3D] Crashtest
Joined: 8 Jan 09
Posts: 58
Credit: 53,074,535
RAC: 48,402
Message 22940 - Posted: 21 May 2009, 20:47:38 UTC - in response to Message 22939.  

but this kind of WU would be a 4000 to 8000 Credit WU (based on current Credits/WU : 2 )
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22941 - Posted: 21 May 2009, 20:48:19 UTC - in response to Message 22934.  


ATI people won't stay with a project that has no work available when there is a project that supports ATI too even if credit is a little less but available.


Mostly true, yes, but you live in a part of the world that is more socialist than capitalist. Many of the people I'm thinking of will follow the fundamental principle of a capitalist economy: they'll go wherever they can maximize their profit.

Bottom line is that it would be best if GPUs can't play in the CPU sandbox and CPUs can't play in the GPU sandbox...

This is exactly what Travis said in the March 25th news post:

"After testing to make sure that it's working correctly we'll swap to only awarding credit to the GPU applications for GPU milkyway (that way our servers wont be as bogged down, which will mean better work availability)."
uBronan
Joined: 9 Feb 09
Posts: 166
Credit: 27,520,813
RAC: 0
Message 22942 - Posted: 21 May 2009, 20:54:06 UTC - in response to Message 22924.  

Did you copy the aticalx.dll files and rename them to amdcalx.dll?

Yes. I have found it's the "can't write" error, so I'm not sure I can solve this, since the fix seems to only work with the 8.12 driver.

1000 x bigger

I would not have a problem with 100x bigger units, because my CPU would crunch them nicely ;)
But I guess the slower machines out there would have a stroke.
Although I wonder if there are many slower machines here on MW, other than the big network guys using their bosses' systems ;) ( I was warned I'd be fired if I did ... )

It's new, it's relatively fast... my new bicycle
The Gas Giant
Joined: 24 Dec 07
Posts: 1947
Credit: 240,884,648
RAC: 0
Message 22944 - Posted: 21 May 2009, 21:13:35 UTC - in response to Message 22942.  
Last modified: 21 May 2009, 21:39:28 UTC

1000 x bigger

I would not have a problem with 100x bigger units because my cpu would crunch them nicely ;)
But i guess the slower machines out there would have a stroke.
Although i wonder if there are many slower machines here on MW other then the big network guys using their bosses systems ;) ( I was warned to get fired if i did ... )

1000x bigger for the GPU-only WUs...
MontagsMeeting
Joined: 12 Mar 09
Posts: 61
Credit: 9,214,340
RAC: 0
Message 22945 - Posted: 21 May 2009, 21:16:42 UTC - in response to Message 22941.  

Mostly true, yes, but you live in a part of the world that is more socialist than capitalist. Many of the people I'm thinking of will go with the fundamental principles of a Capitalist economy, which is they'll go to wherever they can maximize their profit.
I have two cards that can do more than 140,000 credits per day, and I actually get about 90,000.
If these cards could get 120,000 credits with MW GPU, what would you do? Really stay with MW non-GPU for capitalistic reasons?

Another point is that I have to babysit MW a lot, and I would love to see WUs that run a few hours for the price of a little more real credit, versus a lot more possible but unavailable credit - a good capitalistic deal, I think: maximum profit for minimum effort.
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22950 - Posted: 21 May 2009, 21:59:16 UTC - in response to Message 22942.  



1000 x bigger

I would not have a problem with 100x bigger units because my cpu would crunch them nicely ;)
But i guess the slower machines out there would have a stroke.
Although i wonder if there are many slower machines here on MW other then the big network guys using their bosses systems ;) ( I was warned to get fired if i did ... )


My computer, an AMD Athlon64 3700+ overclocked to roughly the equivalent of an FX-57 (the fastest non-HyperThreaded single-core CPU), takes around 30 minutes for the larger tasks we have now, 15-20 for the smaller. 30 * 100 = 3000 minutes / 60 = 50 hours, against a 72-hour deadline. I would have to run the computer with BOINC for about 70% of the deadline window for the task to complete in time.

The project has already said that 3 days is a bit long to wait for a task, given the way the search is built upon the results of prior tasks, but they compromised and bumped up from 2 days to 3 days. Given all of this, increasing the complexity of the current tasks to 100x the current level would be bad for the project from both a scientific and a public-relations standpoint, as anything older than approximately a 3 GHz P4 or an Athlon64 3400+ would not be able to make the deadline consistently unless it were a completely dedicated, never-powered-off BOINC system. At present, based on available stats from the stat sites, that would eliminate at least 5,000 hosts, if not closer to 7,000 or 8,000... or about 15-20% of the total number of hosts attached to the project.

What was done was the best decision based on the available data...assuming it works... If it doesn't work, then I'm sure we'll hear from the ATI camp loudly...
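The deadline arithmetic in the post above can be sketched as a quick back-of-the-envelope check (all numbers come from the post; the variable names are mine):

```python
# Back-of-the-envelope check of the deadline math in the post above.
task_minutes = 30      # current large-task runtime on the Athlon64 3700+
scale_factor = 100     # proposed increase in task size
deadline_hours = 72    # current report deadline (3 days)

runtime_hours = task_minutes * scale_factor / 60    # 3000 min -> 50 h
required_uptime = runtime_hours / deadline_hours    # ~0.69 of the window

print(f"runtime: {runtime_hours:.0f} h, "
      f"required uptime: {required_uptime:.0%} of the deadline")
```

With a 50-hour compute time in a 72-hour window, any machine much slower than this one, or one that is regularly powered off, misses the deadline.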
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22951 - Posted: 21 May 2009, 22:06:29 UTC - in response to Message 22945.  

Mostly true, yes, but you live in a part of the world that is more socialist than capitalist. Many of the people I'm thinking of will go with the fundamental principles of a Capitalist economy, which is they'll go to wherever they can maximize their profit.
I have two cards that can do more than 140,000credits per day and i get about 90,000 actually
if these cards could get 120,000 credits with MWgpu what would you do? Really stay with MW nongpu for capitalistic reasons???

Another point is that i have to babysit MW a lot and i would love to see WUs that run a few hours for the price of little more real credit against a lot more possible but not available credit - a good capitalistic deal i think - maximum profit for minimum effort


If you were able to get 120,000/day with MW_GPU, but were able to get 125,000/day with MW "Classic" (CPU) using the current application, would you switch? Yes, it's a hypothetical question for a situation that currently does not seem to exist for you, but if the load were lightened enough by people with CUDA cards going over to the GPU project that it could happen, would you switch? Do you think everyone would switch? That 50% would? 33%? 25%? 10%?

In theory, this situation should not happen based on the concept of being given the same amount of credit as here. Personally, based on the amount of continual complaining that's gone on here, I wouldn't mind it in the slightest if they increased the amount of credit given to GPUs running the GPU project to totally incentivize a move to the other project so that everyone can enjoy some peace and quiet.
MontagsMeeting
Joined: 12 Mar 09
Posts: 61
Credit: 9,214,340
RAC: 0
Message 22953 - Posted: 21 May 2009, 22:11:30 UTC - in response to Message 22951.  

If the situation is lightened, where is your problem then?
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22955 - Posted: 21 May 2009, 22:28:32 UTC - in response to Message 22953.  

If the situation is lightened where is your problem then?


I apparently have to enable "maximum verbosity" mode... :sigh:

As it stands, you folks with the GPU cards are capping the server. You've even convinced someone who doesn't have a GPU card that the project is putting artificial limits on how many tasks are created. If the people with CUDA cards who don't have ATI cards leave, it will lighten the load, but nobody should expect the problems here to completely go away just because CUDA is supported. What will happen is that things get *SLIGHTLY* better. This will appease some people temporarily, but not for very long.

If ATI support is then added to the GPU project, and enough people head over there, but a few "industrious" people figure out that they can still participate here and, as fate would have it, get more credits here, those "industrious" people would surely be right back here hammering away at the server. Eventually more and more of the competitive types would figure this out and we'd have issues here again.

Just be happy that you're getting your own sandbox, ok? Like I said, I don't even care if they want to increase the amount of credit they give you folks if it means you move over there...
Sabroe_SMC
Joined: 2 Aug 08
Posts: 24
Credit: 329,078,735
RAC: 118,867
Message 22956 - Posted: 21 May 2009, 22:43:56 UTC

Ok guys, the code for the CUDA app is out. Who is able to help Travis? Look in the Code Application thread.
The Gas Giant
Joined: 24 Dec 07
Posts: 1947
Credit: 240,884,648
RAC: 0
Message 22992 - Posted: 22 May 2009, 8:58:08 UTC - in response to Message 22951.  
Last modified: 22 May 2009, 8:59:47 UTC

Mostly true, yes, but you live in a part of the world that is more socialist than capitalist. Many of the people I'm thinking of will go with the fundamental principles of a Capitalist economy, which is they'll go to wherever they can maximize their profit.
I have two cards that can do more than 140,000credits per day and i get about 90,000 actually
if these cards could get 120,000 credits with MWgpu what would you do? Really stay with MW nongpu for capitalistic reasons???

Another point is that i have to babysit MW a lot and i would love to see WUs that run a few hours for the price of little more real credit against a lot more possible but not available credit - a good capitalistic deal i think - maximum profit for minimum effort


If you were able to get 120,000 / day with MW_GPU, but were able to get 125,000/day with MW "Classic" (CPU) utilizing the current application, would you switch? Yes, it's a hypothetical question for a situation that currently does not seem to exist for you, but if the load was lightened up enough by people with CUDA cards going over to the GPU project that it could happen, would you switch? Do you think everyone would switch? Do you think that 50% would switch? 33%? 25? 10?

In theory, this situation should not happen based on the concept of being given the same amount of credit as here. Personally, based on the amount of continual complaining that's gone on here, I wouldn't mind it in the slightest if they increased the amount of credit given to GPUs running the GPU project to totally incentivize a move to the other project so that everyone can enjoy some peace and quiet.

Yes, but if in theory you could get 125,000 by running the CPU WUs on your GPUs, yet due to WU shortages at the CPU project you could only get 70,000, while you could get 120,000 by running the GPU WUs with no shortages - then I know what I would run!
verstapp
Joined: 26 Jan 09
Posts: 589
Credit: 497,834,261
RAC: 0
Message 22995 - Posted: 22 May 2009, 9:53:00 UTC

> I know what I would run!
I don't. Your example was too complicated for my tiny brain. :)
Cheers,

PeterV

.
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 23007 - Posted: 22 May 2009, 11:55:24 UTC - in response to Message 22992.  

I know what I would run!


Travis has stated that they intend not to let the GPU apps get credit here once they are up and running over there...
Paul D. Buck
Joined: 12 Apr 08
Posts: 621
Credit: 161,934,067
RAC: 0
Message 23016 - Posted: 22 May 2009, 13:12:47 UTC - in response to Message 23007.  

I know what I would run!


Travis has stated that they intend on not having the GPU apps get credit here once they are up and running over there...

And it could be as simple as having the validator mark as invalid any tasks that are completed too quickly. Abnormally low run time = GPU processing.
John Clark
Joined: 4 Oct 08
Posts: 1734
Credit: 64,228,409
RAC: 0
Message 23023 - Posted: 22 May 2009, 14:57:46 UTC - in response to Message 23016.  
Last modified: 22 May 2009, 14:59:27 UTC

And it could be as simple as having the validator mark as invalid any tasks that are completed too quickly. Abnormally low run time = GPU processing.


I think that may be too simple where the AMD 955, Intel i7, and Penryn-class CPUs are concerned. Overclocked versions usually finish a WU of the ps_s23_17_ type, giving credit of 28.63 CS, in between 500 and 620 seconds.

Would this speed be classified as a GPU?

Also, if the output is used as another measure, a quad Penryn-class PC will achieve RACs between 15K and 16K.

That is too close to the HD3850 GPU's outputs and speeds for me to be comfortable - though for classification, I presume the GPU client will give it away?
Go away, I was asleep


Paul D. Buck
Joined: 12 Apr 08
Posts: 621
Credit: 161,934,067
RAC: 0
Message 23028 - Posted: 22 May 2009, 17:05:40 UTC - in response to Message 23023.  

And it could be as simple as having the validator mark as invalid any tasks that are completed too quickly. Abnormally low run time = GPU processing.


I think that may be too simple when the AMD 955 and Intel i7s and Penryn class CPUs are concerned. Overclocked versions usually finish a WU of the ps_s23_17_ type, giving credit of 28.63 CS, in between 500 and 620 seconds.

Would this speed be classified as a GPU?

Since I am not, and never have been, a fan of over-clocking when the point is scientific accuracy, it would not bother me to have them classed so ...

However, the GPU processing times were down in the double and single digits of seconds, so even a discriminator of 100 seconds would still leave plenty of leeway...

Along the same line, I am not a fan of the scripters who, in pursuit of their own egos, run scripts to pound on the server when the main problem is server load ... in that light, I am sure there are going to be those who work diligently to fuel their own egos at the expense of the project's welfare and health.

So, I am sure someone is going to spend enormous effort making a GPU application and framework so they can pull more work from here, instead of working with the needs of the project.
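The runtime discriminator described above can be sketched in a few lines (the 100-second cutoff is the figure from the post; the function name and its use are an illustrative sketch, not the project's actual validator logic):

```python
# Illustrative runtime discriminator: flag results whose elapsed time is
# implausibly short for a CPU. The cutoff comes from the post above;
# everything else is a hypothetical sketch, not MilkyWay@home's validator.
GPU_RUNTIME_CUTOFF_S = 100.0

def looks_like_gpu(elapsed_seconds: float) -> bool:
    """Return True when a result finished too quickly to be a CPU run."""
    return elapsed_seconds < GPU_RUNTIME_CUTOFF_S

# Even a heavily overclocked CPU (~500-620 s per WU, per the thread)
# passes, while GPU results (single/double-digit seconds) are flagged.
print(looks_like_gpu(500.0))  # False
print(looks_like_gpu(30.0))   # True
```

Since the fast overclocked CPUs in the thread report 500+ seconds while the GPU times were tens of seconds at most, a fixed threshold leaves a wide margin on both sides.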
MontagsMeeting
Joined: 12 Mar 09
Posts: 61
Credit: 9,214,340
RAC: 0
Message 23036 - Posted: 22 May 2009, 18:55:04 UTC - in response to Message 23007.  

Travis has stated that they intend on not having the GPU apps get credit here once they are up and running over there...

I know how to make my GPUs look like a slightly overclocked dual Xeon W5580 >:D


©2021 Astroinformatics Group