Message boards :
News :
Separation Runs ps_p_82_1s_dr8_4 and de_p_82_1s_dr8_4 Started
Joined: 3 May 09 Posts: 11 Credit: 54,187,229 RAC: 0
I continue to get "An output buffer would exceed CL_DEVICE_MAX_MEM_ALLOC_SIZE" on the de_separation_12 and _13 runs on my ATI Radeon HD 4770. It appears to me that if you check the code to see why I (and, I imagine, the others) keep getting this error, you could determine what the integral size has to be set to in order to avoid it.

It looks like your earlier fix ["I've stopped the de_separation_12_3s_sscon_2 and de_separation_13_3s_sscon_2 runs and restarted them with '3' at the end instead of '2', this time with integral sizes that are the same as the '11' runs. If the restarted runs run fine for you, then I guess the step sizes were the culprit. Let me know how they go for you"] has not solved the problem.

I changed the mw@home preferences to not accept the Separation workunits (Milkyway@Home Separation: no), but mw@home is ignoring the "no" and continues to send me Separation work. That needs to be looked into, to see why it still sends out Separation workunits when I've turned them off.

Thank you for your time and effort in trying to solve this annoying problem; it's appreciated.
Joined: 8 Feb 08 Posts: 261 Credit: 104,050,322 RAC: 0
I was just running a few of the new de_separation_12_3s_sscon_2 and de_separation_12_3s_sscon_3 workunits, and they seem to work for me now (HD 5850; XP).
Joined: 6 May 09 Posts: 217 Credit: 6,856,375 RAC: 0
This is very interesting, especially since the error rates on the new runs are really low. After a lot of digging through logs and looking things up, my best guess is that these runs are overflowing the memory on older GPUs, and that's why only a minority of runs are failing. Also, the "12" run has a larger number of stars to analyze than the "13", "11", or "9" runs, so that might be pushing the required memory just out of range. I won't know for sure unless I can run benchmark software on those exact GPU models. I really can't make the integral sizes any smaller than they are, so the runs will just have to keep going as they are.

"I changed the mw@home preferences to not accept the separation wu-s (Milkyway@Home Separation: no) but mw@home is ignoring the 'no' and continues to send me the 'separations'. That needs to be checked into."

This is an issue with the BOINC client, not Milkyway@home: the BOINC client asks Milkyway@home for work, and then the server sends it out. The server won't send work to a computer that doesn't ask for it. I believe the BOINC client just released an update; you might want to try updating, or rolling back to a previous version. Resetting the project might also fix your problem.
Joined: 19 Jul 10 Posts: 624 Credit: 19,291,222 RAC: 2,151
"After a lot of digging through logs and looking things up, my best guess is that these runs are overflowing the memory on older GPUs, and that's why only a minority of runs are failing."

Well, my GPU is the oldest one you can use here, and the new WUs work fine. I've had quite "a lot" (for this GPU) of sscon_3 WUs with no performance issues at all so far, unlike yesterday. I was using this computer all day, so I would have noticed if it had been unusable for over an hour like it was with the old WUs.
Joined: 19 Jul 10 Posts: 624 Credit: 19,291,222 RAC: 2,151
"I changed the mw@home preferences to not accept the separation wu-s (Milkyway@Home Separation: no) but mw@home is ignoring the 'no' and continues to send me the 'separations'."

"Milkyway@Home Separation" is some sort of placeholder; as you can see on the Applications page, it's not in use (no applications available). To disable Separation WUs you have to disable "MilkyWay@Home".
Joined: 3 May 09 Posts: 11 Credit: 54,187,229 RAC: 0
Thanks for the responses about the mw@home Separation preference setting; they're appreciated.
Joined: 8 Feb 08 Posts: 261 Credit: 104,050,322 RAC: 0
I had a single de_separation_13_3s_sscon_2 fail yesterday, but on closer look it was a resend from the day before, meaning it still had the old parameters. Since then, no more failing WUs.

Regarding the buffer size: wouldn't it be helpful in situations like this to expand the error message to show how much buffer memory was needed and how much was available? Or, even better, add this to the startup info together with 'Global mem size', 'Local mem size', 'Max const buf size', etc.
©2024 Astroinformatics Group