Separation Runs ps_p_82_1s_dr8_4 and de_p_82_1s_dr8_4 Started

MBark

Joined: 3 May 09
Posts: 11
Credit: 54,187,229
RAC: 0
Message 58005 - Posted: 25 Apr 2013, 10:07:51 UTC

I continue to get "An output buffer would exceed CL_DEVICE_MAX_MEM_ALLOC_SIZE" on the de_separation_12 and _13 workunits on my ATI Radeon HD 4770.

It seems to me that if you checked the code to see why I (and, I imagine, others) continue to get this "An output buffer would exceed CL_DEVICE_MAX_MEM_ALLOC_SIZE" error, you could determine what the "integral size" has to be set to in order to avoid it.
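For reference, a minimal OpenCL host sketch of how to query the limit this error refers to (this is not the project's code; device selection is simplified to the first GPU, and error checking is omitted):

```c
/* Minimal sketch: query the per-allocation limit that the
 * "CL_DEVICE_MAX_MEM_ALLOC_SIZE" error refers to, along with the
 * total device memory. Picks the first GPU on the first platform
 * and omits error checking for brevity. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    cl_ulong max_alloc = 0, global_mem = 0;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    /* Largest single buffer the device will allocate; often much
     * smaller than the total global memory. */
    clGetDeviceInfo(device, CL_DEVICE_MAX_MEM_ALLOC_SIZE,
                    sizeof(max_alloc), &max_alloc, NULL);
    clGetDeviceInfo(device, CL_DEVICE_GLOBAL_MEM_SIZE,
                    sizeof(global_mem), &global_mem, NULL);

    printf("Max single allocation: %llu MiB\n",
           (unsigned long long)(max_alloc >> 20));
    printf("Global memory:         %llu MiB\n",
           (unsigned long long)(global_mem >> 20));
    return 0;
}
```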

It looks like your change [I've stopped the "de_separation_12_3s_sscon_2" and "de_separation_13_3s_sscon_2" runs and restarted them with "3" at the end instead of "2", this time with integral sizes that are the same as the "11" runs. If the restarted runs run fine for you, then I guess the step sizes were the culprit. Let me know how they go for you] has not solved the problem.

I changed the MW@home preferences to not accept the separation WUs (Milkyway@Home Separation: no), but MW@home is ignoring the "no" and continues to send me separations. That needs to be looked into to see why it still sends out separation WUs when I've turned them off.

Thank you for your time and effort in trying to solve this annoying problem; it's appreciated.
Len LE/GE

Joined: 8 Feb 08
Posts: 261
Credit: 104,050,322
RAC: 0
Message 58006 - Posted: 25 Apr 2013, 10:24:36 UTC

I was just running a few of the new de_separation_12_3s_sscon_2 and de_separation_12_3s_sscon_3 workunits, and they seem to work for me now (HD 5850; XP).
Matthew
Volunteer moderator
Project developer
Project scientist

Joined: 6 May 09
Posts: 217
Credit: 6,856,375
RAC: 0
Message 58010 - Posted: 25 Apr 2013, 16:41:57 UTC

This is very interesting - especially since the error rates on the new runs are really low.

After a lot of digging through logs and looking things up, my best guess is that these runs are overflowing the memory on older GPUs, which is why only a minority of workunits are failing. Also, the "12" run has more stars to analyze than the "13", "11", or "9" runs, so that might be pushing the required memory just out of range. I won't know for sure unless I can run benchmark software on those exact GPU models.
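As a rough illustration (hypothetical variable names; the actual separation source differs), the guard that raises this error amounts to comparing the size of the output buffer, which grows with the integral dimensions, against the device's per-allocation limit:

```c
/* Rough illustration of the kind of check that produces the
 * "An output buffer would exceed CL_DEVICE_MAX_MEM_ALLOC_SIZE"
 * error. mu_steps and r_steps are hypothetical stand-ins for the
 * integral dimensions; the real separation code differs. */
#include <stdio.h>
#include <CL/cl.h>

int check_output_buffer(cl_device_id device, size_t mu_steps, size_t r_steps)
{
    cl_ulong max_alloc = 0;
    clGetDeviceInfo(device, CL_DEVICE_MAX_MEM_ALLOC_SIZE,
                    sizeof(max_alloc), &max_alloc, NULL);

    /* One double per grid point: a run with a larger integral
     * (more steps) needs a larger single allocation. */
    cl_ulong needed = (cl_ulong)mu_steps * r_steps * sizeof(cl_double);

    if (needed > max_alloc)
    {
        fprintf(stderr, "An output buffer would exceed "
                        "CL_DEVICE_MAX_MEM_ALLOC_SIZE\n");
        return -1;  /* too big for this GPU; the workunit fails */
    }
    return 0;
}
```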

I really can't make the integral sizes any smaller than they are, so the runs will just have to keep going the way they are.

I changed the MW@home preferences to not accept the separation WUs (Milkyway@Home Separation: no), but MW@home is ignoring the "no" and continues to send me separations. That needs to be looked into to see why it still sends out separation WUs when I've turned them off.


This is an issue with the BOINC client, not Milkyway@home; the BOINC client asks Milkyway@home for work, and then the server sends it out. The server won't send work to a computer that doesn't ask for it. I believe a new version of the BOINC client was just released; you might want to try updating, or rolling back to a previous version. Resetting the project might also fix your problem.
Link

Joined: 19 Jul 10
Posts: 572
Credit: 18,833,698
RAC: 665
Message 58013 - Posted: 25 Apr 2013, 17:12:14 UTC - in response to Message 58010.  

After a lot of digging through logs and looking things up, my best guess is that these runs are overflowing the memory on older GPUs, which is why only a minority of workunits are failing.

Well, my GPU is the oldest one you can use here, and the new WUs work fine. I've had quite "a lot" (for this GPU) of sscon_3 WUs, with none of the performance issues I had yesterday. I was using this computer all day, so I would have noticed if it had become unusable for over an hour like it did with the old WUs.
Link

Joined: 19 Jul 10
Posts: 572
Credit: 18,833,698
RAC: 665
Message 58014 - Posted: 25 Apr 2013, 17:16:42 UTC - in response to Message 58005.  

I changed the MW@home preferences to not accept the separation WUs (Milkyway@Home Separation: no), but MW@home is ignoring the "no" and continues to send me separations. That needs to be looked into to see why it still sends out separation WUs when I've turned them off.

"Milkyway@Home Separation" is some sort of placeholder or something, as you can see on the Applications Page it's not in use (no applications available). To disable separation WUs you have to disable "MilkyWay@Home".
MBark

Joined: 3 May 09
Posts: 11
Credit: 54,187,229
RAC: 0
Message 58018 - Posted: 25 Apr 2013, 19:03:32 UTC

Thanks for the responses about the MW@home separation preference setting; they're appreciated.
Len LE/GE

Joined: 8 Feb 08
Posts: 261
Credit: 104,050,322
RAC: 0
Message 58026 - Posted: 26 Apr 2013, 20:17:04 UTC
Last modified: 26 Apr 2013, 20:17:39 UTC

I had a single de_separation_13_3s_sscon_2 fail yesterday, but on closer look it was a resend from the day before, which means it still had the old parameters.
Since then, no more failing WUs.

Regarding the buffer size: wouldn't it be helpful in situations like this to expand the error message to show how much buffer memory was needed and how much was available? Or, even better, add this to the startup info together with 'Global mem size', 'Local mem size', 'Max const buf size', etc.?
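For what it's worth, a sketch of what that expanded startup report could look like (print_device_limits is a hypothetical helper, not a patch against the actual application):

```c
/* Sketch of the expanded diagnostic suggested above: print the
 * required output buffer size next to the device limits the
 * application already reports at startup. Hypothetical helper,
 * not part of the real separation client. */
#include <stdio.h>
#include <CL/cl.h>

void print_device_limits(cl_device_id device, cl_ulong needed_bytes)
{
    cl_ulong global_mem = 0, local_mem = 0, const_buf = 0, max_alloc = 0;

    clGetDeviceInfo(device, CL_DEVICE_GLOBAL_MEM_SIZE,
                    sizeof(global_mem), &global_mem, NULL);
    clGetDeviceInfo(device, CL_DEVICE_LOCAL_MEM_SIZE,
                    sizeof(local_mem), &local_mem, NULL);
    clGetDeviceInfo(device, CL_DEVICE_MAX_CONSTANT_BUFFER_SIZE,
                    sizeof(const_buf), &const_buf, NULL);
    clGetDeviceInfo(device, CL_DEVICE_MAX_MEM_ALLOC_SIZE,
                    sizeof(max_alloc), &max_alloc, NULL);

    printf("Global mem size:      %llu\n", (unsigned long long)global_mem);
    printf("Local mem size:       %llu\n", (unsigned long long)local_mem);
    printf("Max const buf size:   %llu\n", (unsigned long long)const_buf);
    printf("Max mem alloc size:   %llu\n", (unsigned long long)max_alloc);
    printf("Output buffer needed: %llu (%s)\n",
           (unsigned long long)needed_bytes,
           needed_bytes > max_alloc ? "EXCEEDS LIMIT" : "fits");
}
```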