Welcome to MilkyWay@home

Future ATI Card Support?



Message boards : Number crunching : Future ATI Card Support?


1 · 2 · 3 · 4 · Next

The Gas Giant
Joined: 24 Dec 07
Posts: 1947
Credit: 240,884,648
RAC: 0
Message 22820 - Posted: 20 May 2009, 21:31:41 UTC

Once the new CUDA app has been released and the folks with Nvidia cards jump ship to the new project, what is the expected timeline for an official app for ATI graphics cards?
Jord
Joined: 30 Aug 07
Posts: 125
Credit: 207,206
RAC: 0
Message 22821 - Posted: 20 May 2009, 21:33:32 UTC - in response to Message 22820.  

Since BOINC doesn't support detection of ATI GPUs yet, any 'official' application still has to use the anonymous platform or a wrapper.
Jord.

The BOINC FAQ Service.
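For the curious, here is a minimal sketch of what such an anonymous-platform app_info.xml might look like; the app name, version number, and executable file name are illustrative only, not the project's actual ones:

```xml
<app_info>
  <app>
    <name>milkyway</name>
  </app>
  <file_info>
    <!-- illustrative executable name; a real one comes from the app's author -->
    <name>milkyway_0.19_ATI_x86.exe</name>
    <executable/>
  </file_info>
  <app_version>
    <app_name>milkyway</app_name>
    <version_num>19</version_num>
    <file_ref>
      <file_name>milkyway_0.19_ATI_x86.exe</file_name>
      <main_program/>
    </file_ref>
  </app_version>
</app_info>
```

The file goes in the project directory and tells the BOINC client to use your own application for that project's work instead of anything the server would send.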
Alinator
Joined: 7 Jun 08
Posts: 464
Credit: 56,639,936
RAC: 0
Message 22831 - Posted: 20 May 2009, 21:48:29 UTC

Well, that would be tied to whenever BOINC itself provides ATi support directly in the backend and CC, or everyone settles on OpenCL at some point as the best compromise for general purpose use.

I can't see the guys at MW wasting their time doing Berkeley's, the manufacturer's, and/or the standards organizations work (unless they've committed to it as members of those groups, that is), given the relatively low percentage of GPU capable crunchers in the BOINC population as a whole right now.

Keep in mind, the way it would most likely work is that if you have a CUDA card, running it on the GPU would be painless (term used loosely given its current state of support in BOINC), and you'd have to run ATi apps the same way there as you do here. It would be pretty pointless to go through all the trouble of creating the GPU project and then allow the ATi cards to continue to run here.

Alinator
The Gas Giant
Joined: 24 Dec 07
Posts: 1947
Credit: 240,884,648
RAC: 0
Message 22864 - Posted: 21 May 2009, 4:32:37 UTC

I understand the issue of 'official' BOINC support, but the project still has the ability to release an app for ATI GPUs that uses the app_info.xml anonymous-platform route. So the question still is: will the project release an official ATI app, or will all the folks who have purchased ATI cards be left out in the cold?
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22878 - Posted: 21 May 2009, 11:00:25 UTC - in response to Message 22864.  

I understand the issue of 'official' BOINC support, but the project still has the ability to release an app for ATI GPUs that uses the app_info.xml anonymous-platform route. So the question still is: will the project release an official ATI app, or will all the folks who have purchased ATI cards be left out in the cold?


"Official" = created and released by the project
"Anonymous Platform" = created by volunteer developers and made available through 3rd party download...or sometimes made available on the project's own web site (can't remember if SETI did / does that or not).

The only exception I know to this general rule is Einstein, which uses the anonymous platform mechanism for users to test upcoming releases of the official code base...

That's why I am frustrated and getting blue in the face, because people don't seem to be understanding this...
MontagsMeeting
Joined: 12 Mar 09
Posts: 61
Credit: 9,214,340
RAC: 0
Message 22895 - Posted: 21 May 2009, 14:08:13 UTC

I understand the frustration of people who invested money in ATI cards because of BOINC (MilkyWay), but in my eyes there are two fundamental misunderstandings:

1. BOINC is meant to use spare CPU/GPU cycles to do something useful while the machines are not doing their intended work. So any investment dedicated solely to BOINC is the first basic mistake.

2. Neither BOINC nor any project ever promised that any specific system would be supported. Even a complete project could be shut down without the volunteers being entitled to anything. So betting on a specific system is the second basic mistake.

As for MilkyWay, I don't believe they will end support for ATI, neither for the non-GPU project nor, of course, for the GPU project. Cluster Physik already said that he'll work on an ATI app for the GPU project, perhaps by the weekend.

I see absolutely no reason for a project to release an app for hardware not supported by BOINC. Such work will use a lot more project resources than standard apps and is very likely to stop working once the hardware is supported by BOINC, so the work has to be done twice. If I take into account how long it took to get the GPU project going, I can't see any free resources for an official ATI app.

To focus on something not officially supported always includes the risk that it will stop working in the future. If you want to take that risk, do it; if it is too much risk, let it be. But do not blame anyone else for decisions you made on your own.

will all the folks who have purchased ATI cards be left out in the cold?
They will be left exactly where they have been all along.
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22902 - Posted: 21 May 2009, 14:53:47 UTC - in response to Message 22895.  


As for MilkyWay, I don't believe they will end support for ATI, neither for the non-GPU project nor, of course, for the GPU project.


Actually, at least in theory, GPU apps should no longer be able to process CPU tasks once all this is said and done, meaning a CUDA app is available at the GPU project as well as either Brook/CAL or OpenCL. The worst possible scenario is to allow a credit level that is higher here with a GPU processing a CPU task than a GPU processing a GPU task at the GPU project. This would mean that there would be no incentive for people with GPUs to move to the other project and so we'd end up with the heavy consumption of tasks continuing here and little to no utilization of the GPU project.
GalaxyIce
Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 22903 - Posted: 21 May 2009, 14:57:30 UTC - in response to Message 22895.  

I understand the frustration of people who invested money in ATI cards because of BOINC (MilkyWay), but in my eyes there are two fundamental misunderstandings:

1. BOINC is meant to use spare CPU/GPU cycles to do something useful while the machines are not doing their intended work. So any investment dedicated solely to BOINC is the first basic mistake.

2. Neither BOINC nor any project ever promised that any specific system would be supported. Even a complete project could be shut down without the volunteers being entitled to anything. So betting on a specific system is the second basic mistake.

As for MilkyWay, I don't believe they will end support for ATI, neither for the non-GPU project nor, of course, for the GPU project. Cluster Physik already said that he'll work on an ATI app for the GPU project, perhaps by the weekend.

I see absolutely no reason for a project to release an app for hardware not supported by BOINC. Such work will use a lot more project resources than standard apps and is very likely to stop working once the hardware is supported by BOINC, so the work has to be done twice. If I take into account how long it took to get the GPU project going, I can't see any free resources for an official ATI app.

To focus on something not officially supported always includes the risk that it will stop working in the future. If you want to take that risk, do it; if it is too much risk, let it be. But do not blame anyone else for decisions you made on your own.

will all the folks who have purchased ATI cards be left out in the cold?
They will be left exactly where they have been all along.

That last sentence is correct at least ;) However, BOINC is meant to do a number of things, and one of them is to attract people to donate their computers free of charge. Saying that you need to do no more than contribute your spare cycles is just one way of selling a punter the idea of BOINCing. After that, if you want to run quad cores, and a number of them, exclusively for BOINC, then that's up to you. But it's certainly no basic mistake.

No-one should be concerned about having no use for a PC (or many) if a project goes under, as many have. We just switch to another project (or several). No mistake there either for anyone to choose any project and worry about it going bust.

So what about ATI cards? Is it a mistake to buy one now? I can't say, I'm unlikely to buy any more, my buying contribution there is done. Was it a mistake to buy ATI cards? I can only speak for myself; I have millions of credits generated in a very short time, and I am turning off electricity hungry CPU crunchers all the time as I concentrate loads'a'crunch into fewer boxes (actually my initial idea was to make do with one box, but then maybe I want more than 'just enough' :)
..... and the yellow brick road just seems to go on and on and on.....

For those who went for ATI at the start I'd say it was no mistake, and for me an interesting journey to get caught up with them in MW crunching. For those thinking about it, you can see what's gone before and what might go from now. As for me, I will have ATI and zillions of credits whatever happens.


MontagsMeeting
Joined: 12 Mar 09
Posts: 61
Credit: 9,214,340
RAC: 0
Message 22910 - Posted: 21 May 2009, 17:13:06 UTC - in response to Message 22903.  

Ice wrote:
That last sentence is correct at least ;) However, BOINC is meant to do a number of things, and one of them is to attract people to donate their computers free of charge. Saying that you need to do no more than contribute your spare cycles is just one way of selling a punter the idea of BOINCing. After that, if you want to run quad cores, and a number of them, exclusively for BOINC, then that's up to you. But it's certainly no basic mistake.
You're right, it's not a mistake doing it, but as you said it's up to the one who's doing and not up to BOINC or the project.

No-one should be concerned about having no use for a PC (or many) if a project goes under, as many have. We just switch to another project (or several). No mistake there either for anyone to choose any project and worry about it going bust.
That's the reason for my answer: people here sound like they're very concerned about the work the project has available, so in both points the mistake, or better the misunderstanding, lies in the expectations people have and in what they think BOINC and the projects promised.

Sorry, I lost the topic a little. I meant, as I said in the first sentence, misunderstandings. English isn't my primary language, so it's sometimes hard to say what I really want to.

So what about ATI cards? Is it a mistake to buy one now? I can't say, I'm unlikely to buy any more, my buying contribution there is done. Was it a mistake to buy ATI cards? I can only speak for myself; I have millions of credits generated in a very short time, and I am turning off electricity hungry CPU crunchers all the time as I concentrate loads'a'crunch into fewer boxes (actually my initial idea was to make do with one box, but then maybe I want more than 'just enough' :)
I know from your other postings that you take things here as they come and live with them; that's very OK for me. What is not OK for me is that people make decisions for whatever reason, and if it doesn't work like they want, they look for someone or something to blame - BOINC, the projects, ATI cards, GPUs altogether and so on. They only leave out the one who is really responsible: themselves.

..... and the yellow brick road just seems to go on and on and on.....
Sorry, I don't understand this one.

For those who went for ATI at the start I'd say it was no mistake, and for me an interesting journey to get caught up with them in MW crunching. For those thinking about it, you can see what's gone before and what might go from now. As for me, I will have ATI and zillions of credits whatever happens.
For me, buying ATI was a big mistake, but not because of BOINC; I'm very happy having them crunching. But I didn't buy them because of BOINC - I didn't even know I could use them for BOINC at first. I bought them for my business, where I need the ability to watch any sort of content. The mistake with ATI is their crappy drivers; I have big problems running two 24" monitors (if that is really the problem, but I've eliminated everything else, I think).


Brian Silvers wrote:
Actually, at least in theory, GPU apps should no longer be able to process CPU tasks once all this is said and done, meaning a CUDA app is available at the GPU project as well as either Brook/CAL or OpenCL. The worst possible scenario is to allow a credit level that is higher here with a GPU processing a CPU task than a GPU processing a GPU task at the GPU project. This would mean that there would be no incentive for people with GPUs to move to the other project and so we'd end up with the heavy consumption of tasks continuing here and little to no utilization of the GPU project.
Why should they drop ATI support? We will see what happens, and then decisions can be made. Making decisions based on some paranoia about worst-case scenarios was never a good way. I believe things will sort themselves out. I don't know if MW can crunch all the work it can send with CPUs alone; if not, it would be stupid to drop a second source. I don't know how MWgpu will go; perhaps it will be saturated with CUDA cards. Perhaps the best way is to split the available ATI cards between the projects. And perhaps MWgpu will perform so well and use all available resources that spending resources on a second project isn't possible anymore and the old MW will be dropped. I don't even know if it's easily possible to run ATI at MWgpu, as the ATI app is recognized as a CPU app - there are no ATI GPUs in the BOINC world. Questions upon questions that can only be speculated about.
Martin Chartrand
Joined: 25 Mar 09
Posts: 65
Credit: 53,099,671
RAC: 0
Message 22912 - Posted: 21 May 2009, 17:22:44 UTC - in response to Message 22903.  

I have mixed emotions about this but I have to agree with Ice.


So what about ATI cards? Is it a mistake to buy one now? I can't say, I'm unlikely to buy any more, my buying contribution there is done.


Same here! My contribution, monetary speaking, is now done for the moment that is and will just wait what the future months bring.
For now I have a 4870 downstairs that has done as much work in a month as it took my 2 computers to do in 1 year.

Having 2 gaming systems and having been an "NVIDIA" guy all my life, the decision to purchase a 4870 was based on both the gaming power and the crunching ability of this specific card.

And a GTX285 upstairs. So right now I am covered on the 2 different routes, whether ATI or NVIDIA wins the crunching battle.
My GTX285 was purchased for my gaming addiction to first-person shooters and RPGs like Far Cry 2 and Half-Life and so on.

I had a 8600GT in the downstairs computer so it was time to make a decision and I chose to spend the money for a 4870.

Having said all that, I also admit it is gratifying in knowing one can contribute so much credit in so little time and the credit versus power consumption is well worth it.

Heck, it took me 2 years to get to 11,000 credits when I first started crunching for SETI with my now defunct Celeron. Needless to say, the credit versus power consumption then was just abhorrent.

But science tends to go in one direction only and that is upward.
I wish it would be the same for my investment...

I think those who have spent money on whatever hardware should be rewarded as far as crunching/credits go. By the same token, let's not forget the ones who simply cannot afford better hardware and can only hope that an app will help them too.
Not too long ago I was one who could not afford a dime. But times change.
Would I wish to have a farm of computers? Heck yeah. But this is simply not an option for me.

Martin Chartrand
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22921 - Posted: 21 May 2009, 19:15:17 UTC - in response to Message 22910.  


Brian Silvers wrote:
Actually, at least in theory, GPU apps should no longer be able to process CPU tasks once all this is said and done, meaning a CUDA app is available at the GPU project as well as either Brook/CAL or OpenCL. The worst possible scenario is to allow a credit level that is higher here with a GPU processing a CPU task than a GPU processing a GPU task at the GPU project. This would mean that there would be no incentive for people with GPUs to move to the other project and so we'd end up with the heavy consumption of tasks continuing here and little to no utilization of the GPU project.
Why should they drop ATI support? We will see what happens, and then decisions can be made. Making decisions based on some paranoia about worst-case scenarios was never a good way. I believe things will sort themselves out. I don't know if MW can crunch all the work it can send with CPUs alone; if not, it would be stupid to drop a second source. I don't know how MWgpu will go; perhaps it will be saturated with CUDA cards. Perhaps the best way is to split the available ATI cards between the projects. And perhaps MWgpu will perform so well and use all available resources that spending resources on a second project isn't possible anymore and the old MW will be dropped. I don't even know if it's easily possible to run ATI at MWgpu, as the ATI app is recognized as a CPU app - there are no ATI GPUs in the BOINC world. Questions upon questions that can only be speculated about.


Apparently you didn't understand me...or human nature for that matter.

If a GPU task for an ATI card yields less credit per unit time at the GPU project, there will be a sizeable contingent of people who are in it only for the points; they will stick around here and churn through tasks just like they are today, and thus very little will have been gained from the months of work and months of complaining here...

There are two ways for the project to prevent this from happening:

1) Disallow GPUs from processing CPU tasks.
2) Reduce the credit for CPU tasks so that it is not as enticing to GPU users.

If you advocate the second option, you're going to piss off quite a few people.

Alternatively, if people who are in it for the points stick around here because they get more than what they can get there, then CPU users will be shafted again by the same no-work-availability situation, so the project may just as well have done what several GPU users advocated: get rid of all of us who use a CPU and go to a GPU-only project. Of course, doing that would've also meant that CUDA would've been first, so there would've been gnashing of fingers that way too, with all the "ATI should've been first"...

I swear...it really isn't this difficult to do conceptual thinking... Maybe it's because I'm a system admin that I tend to look for more than just the one "ideal" outcome. It's called "contingency planning".

uBronan
Joined: 9 Feb 09
Posts: 166
Credit: 27,520,813
RAC: 0
Message 22923 - Posted: 21 May 2009, 19:32:43 UTC

There is not one project which officially supports ATI, and that is still an issue to me, since I like to choose which one I want to use and was hoping one project would finally pick up the potential power of this GPU type.

So I based my recent buys on performance in games and on power consumption. I am just no longer looking at what BOINC does; as I already mentioned, I don't really care anymore.

I must mention that in fact MW@home made me return to crunching, since I finally saw a project which supported ATI; my old cards were too slow, so I could not enter with them (HD3200).

I ordered 2 x 4770 ATI cards 3 weeks ago, since they are cheap, eat little power and give very, very good performance.

In fact I found this card in many cases beats an NVIDIA 250 or its siblings, so with my new CrossFire setup I see a huge performance boost on my PC.

Of course there are other projects than MW, but they simply don't give me enough points for the processing power I put into them.

On many fora I see some people trying to get the project admins to lower the rewards to even below the SETI standard.

As for the GPU crunchers: as I remember, the plan from the admins here was to move the GPU guys off the overstressed server of the CPU project and onto the new GPU project.

Simply said, if the ATIs stay on the CPU project, we will never get a normal amount of CPU units, and some days I will again see my CPUs doing nothing.

The server would not be relieved of the stress from those unit-hungry monsters. So I guess a new application will be released to relieve the pressure on this server.

I just wanted to say to you guys that an ATI app is not a given fact.
Since many were writing that of course an ATI app would be built, time will tell if the guys are able to build such an application.
I simply would love to see that come true.

On the other hand, I already found that my new cards are not working with the released application.
Somehow all the units error out on all the different versions I tried; I asked around, but until today nobody has given me an answer which solved this.
It's new, it's relatively fast... my new bicycle
arkayn
Joined: 14 Feb 09
Posts: 999
Credit: 74,932,619
RAC: 0
Message 22924 - Posted: 21 May 2009, 19:35:54 UTC - in response to Message 22923.  

Did you copy the aticalx.dll files and rename them to amdcalx.dll?
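For anyone unsure what that rename involves: the idea is that the driver ships the CAL runtime under atical*.dll names while the app looks for the old amdcal*.dll names, so you place renamed copies next to the app. A rough illustration in POSIX shell, using a scratch directory and stand-in files (on Windows you'd copy the real DLLs with Explorer or `copy`):

```shell
# Work in a scratch directory with stand-in files; the real DLLs live in the
# graphics driver's install / system directory.
mkdir -p /tmp/cal_demo && cd /tmp/cal_demo
touch aticalcl.dll aticalrt.dll aticaldd.dll   # stand-ins for the driver DLLs

# Copy each atical*.dll to the amdcal*.dll name the app expects.
for f in atical*.dll; do
  cp "$f" "amdcal${f#atical}"
done

ls amdcal*.dll   # the renamed copies now exist alongside the originals
```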
GalaxyIce
Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 22926 - Posted: 21 May 2009, 19:39:13 UTC - in response to Message 22921.  

I swear...it really isn't this difficult to do conceptual thinking...

I'm sure you're right Brian. But I swear... there are no 'CPU' WUs and no 'GPU' WUs. There are just work units which the crunchers decide what to crunch with, since Cluster Physik gave us that choice...



Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22927 - Posted: 21 May 2009, 19:48:06 UTC - in response to Message 22926.  
Last modified: 21 May 2009, 19:50:56 UTC

I swear...it really isn't this difficult to do conceptual thinking...

I'm sure you're right Brian. But I swear... there are no 'CPU' WUs and no 'GPU' WUs. There are just work units which the crunchers decide what to crunch with, since Cluster Physik gave us that choice...



Now, yes. However, and this is where people apparently got lost, I wasn't talking about now. I was talking about a possible future.

If a future GPU task yields less credit than a future CPU task (current tasks here would end up being the future "CPU" tasks) when that CPU task is processed by the current anonymous-platform application for ATI GPUs, then human nature will take hold and will limit participation in the GPU project by ATI card owners even if ATI support is available. Without a project-enforced method of blocking GPUs from the CPU project, GPUs will wander back over to the CPU project and bring the project right back to the point we're at today.

Since numbers are easier to visualize, I offer these hypotheticals:

Current task processed by GPU app - 300 cr/hr
Future task processed by GPU app at GPU project - 250 cr/hr
Future task processed by GPU app at CPU project - 300 cr/hr

You tell me where most of the GPU processing is going to be done if no limitations are put in place.

Hope this is clearer...and it is most definitely not a far-fetched idea...
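In code form, the hypothetical above looks like this (the rates and labels are invented, as stated):

```python
# Hypothetical credit rates in cr/hr, taken from the figures above.
rates = {
    "current task, GPU app, CPU project": 300,
    "future task, GPU app, GPU project": 250,
}

def credit_maximizer_choice(rates):
    """Where a purely points-driven cruncher ends up: the highest cr/hr."""
    return max(rates, key=rates.get)

choice = credit_maximizer_choice(rates)
gap_per_day = (300 - 250) * 24  # extra credits/day per card for staying put
print(choice)       # current task, GPU app, CPU project
print(gap_per_day)  # 1200
```

With a 50 cr/hr edge, staying on the CPU project is worth 1,200 extra credits per card per day, which is the incentive problem in a nutshell.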
[P3D] Crashtest
Joined: 8 Jan 09
Posts: 58
Credit: 52,485,036
RAC: 2,846
Message 22928 - Posted: 21 May 2009, 20:03:54 UTC - in response to Message 22927.  

We need bigger WUs! (At least 40 times the current size, or more.)

If this project changed from DP to SP, ATI cards (48xx) would only need about 5-8 s for a WU of the 82 type (like the one Travis used with his NVIDIA GPU - but he needed 1 min!) --> this change would mean that we need even more WUs than today!


Why should they change the credits?

Same credits for same work!
10^20 flops calculated on a GPU = 10^20 flops calculated on a CPU!

Or they would have to change the credits for multicore CPUs vs. singlecore CPUs too!!!
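The "same credits for same work" rule amounts to making credit a function of floating-point operations only, never of the device or its runtime. A toy sketch (the credit constant is invented, not the project's real formula):

```python
CREDIT_PER_TERAFLOP = 0.0002  # invented scale factor, for illustration only

def credit_for(flops_done):
    """Credit depends only on work done, not on which device did it."""
    return flops_done / 1e12 * CREDIT_PER_TERAFLOP

work = 1e20  # the 10^20 flop figure above
# A GPU finishing this in seconds and a CPU taking hours earn the same credit:
gpu_credit = credit_for(work)
cpu_credit = credit_for(work)
print(gpu_credit == cpu_credit)  # True
```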
GalaxyIce
Joined: 6 Apr 08
Posts: 2018
Credit: 100,142,856
RAC: 0
Message 22931 - Posted: 21 May 2009, 20:10:18 UTC - in response to Message 22927.  

Hope this is clearer...and it is most definitely not a far-fetched idea...

I hope with certainty that this *is* a far-fetched idea. At the beginning, when the ATI development started, I asked Travis the specific question of whether he would drop the tariff for GPU-crunched work units. His answer was very specific: he would not. Credits would be applied the same no matter how the WUs were crunched.

This 'risk' of diminished returns just because you were using a GPU was ruled out by Travis and I don't see why this should change. A work unit crunched should get the same credit no matter who or what crunched it.


MontagsMeeting
Joined: 12 Mar 09
Posts: 61
Credit: 9,214,340
RAC: 0
Message 22934 - Posted: 21 May 2009, 20:33:15 UTC - in response to Message 22921.  

Apparently you didn't understand me...or human nature for that matter.

If a GPU task for an ATI card yields less credit per unit time at the GPU project, there will be a sizeable contingent of people who are in it only for the points; they will stick around here and churn through tasks just like they are today, and thus very little will have been gained from the months of work and months of complaining here...

There are two ways for the project to prevent this from happening:

1) Disallow GPUs from processing CPU tasks.
2) Reduce the credit for CPU tasks so that it is not as enticing to GPU users.

If you advocate the second option, you're going to piss off quite a few people.

Alternatively, if people who are in it for the points stick around here because they get more than what they can get there, then CPU users will be shafted again by the same no-work-availability situation, so the project may just as well have done what several GPU users advocated: get rid of all of us who use a CPU and go to a GPU-only project. Of course, doing that would've also meant that CUDA would've been first, so there would've been gnashing of fingers that way too, with all the "ATI should've been first"...

I swear...it really isn't this difficult to do conceptual thinking... Maybe it's because I'm a system admin that I tend to look for more than just the one "ideal" outcome. It's called "contingency planning".
I understand your concerns very well, but totally disagree on almost every point.
Credit is given for work, not time.
ATI people won't stay with a project that has no work available when there is a project that supports ATI too, even if the credit is a little less but work is available.
If credit is less at MWgpu, that is a calculation error and must be corrected, or it is due to a better optimized application, in which case the optimizations should be used with non-GPU MW too.
I think Murphy himself would pale with fear if he thought about things that can go wrong the way you do :D
Brian Silvers
Joined: 21 Aug 08
Posts: 625
Credit: 558,425
RAC: 0
Message 22935 - Posted: 21 May 2009, 20:37:24 UTC - in response to Message 22931.  

Hope this is clearer...and it is most definitely not a far-fetched idea...

I hope with certainty that this *is* a far-fetched idea. At the beginning, when the ATI development started, I asked Travis the specific question of whether he would drop the tariff for GPU-crunched work units. His answer was very specific: he would not. Credits would be applied the same no matter how the WUs were crunched.

This 'risk' of diminished returns just because you were using a GPU was ruled out by Travis and I don't see why this should change. A work unit crunched should get the same credit no matter who or what crunched it.


I don't so much mean that the situation would happen on purpose; I figure it could happen as a side effect of getting things up and running. That said, from the tone of some of the posts I've seen in the past about credits, perhaps as little as a 5-10 cr/hr difference in favor of the CPU side would be enough to entice some people. 10/hr is 240/day, and if the person has multiple machines and/or multiple cards, that's quite a bit...
banditwolf
Joined: 12 Nov 07
Posts: 2425
Credit: 524,164
RAC: 0
Message 22936 - Posted: 21 May 2009, 20:38:31 UTC - in response to Message 22926.  
Last modified: 21 May 2009, 20:39:11 UTC

I swear...it really isn't this difficult to do conceptual thinking...

I'm sure you're right Brian. But I swear... there are no 'CPU' WUs and no 'GPU' WUs. There are just work units which the crunchers decide what to crunch with, since Cluster Physik gave us that choice...


The only real difference is that the GPU WUs are supposed to be 1000x longer, such that a typical computer can't finish one within the 3-day deadline.
Doesn't expecting the unexpected make the unexpected the expected?
If it makes sense, DON'T do it.


©2021 Astroinformatics Group