Welcome to MilkyWay@home

Posts by Furl Hawkens

1) Message boards : Number crunching : New Benchmark Thread - times wanted for any hardware, CPU or GPU, old or new! (Message 75011)
Posted 4 Feb 2023 by Furl Hawkens
So, bearing in mind that I am someone who is normally fine with detailed instructions like these for overclocking, benchmarking, and such...

These are restrictions that in a lot of cases don't make much sense on modern hardware any more...

That said, I will add some results:

//Windows 10, R9-5950x (PBO On, Core Optimizer off, AiO), Radeon RX-6900xt (stock)

The machine runs with the GPU under full load at the stock 0.05 CPU threads per GPU, and with the CPU fully loaded, including SMT threads.
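For anyone wanting to reproduce the multi-WU-per-GPU setup, BOINC reads an app_config.xml in the project directory; a minimal sketch is below. The app name "milkyway" is an assumption on my part; check client_state.xml for the exact name on your install.

```xml
<!-- Sketch of an app_config.xml to run 4 Separation tasks per GPU. -->
<!-- The app name "milkyway" is assumed; verify it in client_state.xml. -->
<app_config>
  <app>
    <name>milkyway</name>
    <gpu_versions>
      <gpu_usage>0.25</gpu_usage>  <!-- 1/0.25 = 4 tasks share one GPU -->
      <cpu_usage>0.05</cpu_usage>  <!-- matches the stock CPU reservation -->
    </gpu_versions>
  </app>
</app_config>
```

After saving the file, use "Options > Read config files" in the BOINC Manager to apply it without restarting the client.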

Milkyway@home Separation v1.46 (opencl_ati_101) windows_x86_64 times (GPU): ~25-35 s per work unit with 1 unit on the GPU, but the GPU had lots of idle time interspersed with bursts of activity. ~90 seconds per work unit with 4 WUs on the GPU at once.
Milkyway@home Separation v1.46 windows_x86_64 (CPU): ~2,600 seconds, both real and CPU time; it appears to run single-threaded, with up to 32 instances possible simultaneously.
Milkyway@home N-Body Simulation v1.82 (mt) windows_x86_64: all over the place, as these tend to get paused and resumed. But from the runs that go uninterrupted for a while, I am seeing ~3-5k seconds of real-world time on 16 compute threads for 50-70k seconds of CPU time.
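For what it's worth, the Separation timings above actually favor the 4-at-once configuration; a quick back-of-the-envelope check (the ~30 s and ~90 s figures are my rounded assumptions for the averages quoted in the post):

```python
# Rough throughput comparison for the Separation GPU timings above.
# Assumed averages: ~30 s/WU with 1 WU at a time (lots of GPU idle),
# ~90 s/WU when 4 WUs share the GPU.

single_wu_time = 30.0    # seconds per WU, 1 WU on the GPU
concurrent_time = 90.0   # seconds per WU, 4 WUs on the GPU
concurrent_wus = 4

throughput_single = 1 / single_wu_time               # WU per second
throughput_concurrent = concurrent_wus / concurrent_time

print(f"1 WU at a time:  {throughput_single * 3600:.0f} WU/hour")   # 120
print(f"4 WUs at a time: {throughput_concurrent * 3600:.0f} WU/hour")  # 160
```

So even though each individual WU takes roughly three times as long, total throughput goes up by about a third, which fits the observation that a single WU leaves the GPU idle much of the time.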

// Have two more rigs, will post later; lazy now haha, just off work and all.

Also (general question, not at the OP): where are the GPU WUs? They seem to have dried up!? lol
2) Message boards : Number crunching : Cooling Problems (Message 49948)
Posted 6 Jul 2011 by Furl Hawkens
OK, let me try to give you guys some answers.

First off, I'm very PC-centric, and I know the issues are not related to the rest of the cooling system.

The long and short of it is that XFX skimped on cooling. They used inferior materials in the HS/F configuration. After pulling the cooler and comparing it to AMD/ATi's stated stock cooling design, I actually questioned them as to why theirs was made from lesser materials (i.e. painted aluminum where AMD specifies brushed copper...)

They told me they could not give an answer, but that they would not have made the card with improper cooling. yeah right lmao.

Stock speeds for my card are 875/975. With all settings on auto, the driver will never raise the fan above 65%, resulting in an overheat under stressful applications (like FurMark, OCCT, looping 3DMark, or MW@H).

If I turn the fan up it's insanely loud, and even at 100% it will not keep the card cool enough (i.e. below 90C) when it's running the above-mentioned programs.

I can underclock it and the temps get under control, but I have to go down to 500/675 before it stays cool enough on auto fan control, and at that low a setting I begin to get computational errors, which defeats the whole purpose...

As for other cooling, here's the full system:

Apevia full-tower case, 5x 120mm case fans (1 in front, 1 in top, 1 in side, 2 out back).

CPU is a Core 2 Quad Q6700 @ 3.6 GHz with a Cooler Master Hyper 212+ and 2x 120mm (90 CFM) fans in push/pull.

CPU runs ~30C idle, ~60C load.

Northbridge runs 40C idle, 50C load.

GPU runs ~68C idle, 85-97C load, sometimes passing 100C and crashing.

GPU VRMs (which I think are what's actually overheating) run at 80-90C idle, 100-130C load. The cooler is a rear-exhaust design, so not a lot of hot air is being recirculated.

I have already tried changing the TIM on the card (with XFX's OK); it helped by about 3 degrees, but not much.

When I upgrade the card in a year or two I plan on making XFX warranty it out. I also hope that by then I might get something a bit nicer. 4890s will be old by then, and they have to replace it with either a 4890 or something "comparable in performance", so maybe a newer card?...
3) Message boards : Number crunching : Time for a graphic visualizer?... (Message 49886)
Posted 4 Jul 2011 by Furl Hawkens
Thanks for the link. After skimming through the beginning of that thread, it looks like the main concern is the impact a visualizer would have on GPU computing.

I know that I use one of my GPUs for MW@H and will run the SETI visualizer as my screen saver/background.

I have had no issues with this, nor any discernible decrease in the computational power of my system. This is on a system that is weak by modern standards, so I would reason that more powerful systems would see even less of an impact.
4) Message boards : Number crunching : GPU Requirements (Message 49885)
Posted 4 Jul 2011 by Furl Hawkens

As for why I'm not using my 4890, only the 3850: it's because the 4890 overheats. It always has, unfortunately.

MW@H is not very dependent on memory clock; that means you can 'underclock' your card without significant loss of credit, but with a remarkable reduction in heat. I use 'Afterburner' for that.

There were several discussions about this in the past. Many crunchers have tested it; it works.

Already tried. If I underclock my GPU, memory, or both to points that are thermally acceptable, I begin to get constant corruption of the data that's being processed.

I can't get it low enough, and have it stay stable enough, for DP calculations, apparently...
5) Message boards : Number crunching : Time for a graphic visualizer?... (Message 49873)
Posted 4 Jul 2011 by Furl Hawkens

So I was thinking about MW@H and it seems like it would be the prime candidate for a low resource visual interface.

SETI is an example of a competing project with something similar. It has a display that can be shown on demand as well as used as a screen saver.

It would seem that MW@H is an even better candidate for a visualizer than SETI.

Perhaps a screen saver where we see the MW galaxy forming as the computer sees it, or something similar?...

Perhaps something like what is shown here ( http://milkyway.cs.rpi.edu/milkyway/forum_thread.php?id=2491 ) except rather than a sim of them colliding it could be a visual representation of whatever the computer is simulating about the MW galaxy at the time...

Are there any other ideas for something like this, or is this something that people would be interested in?...

I know that I would be willing to help with this if you guys want it (but I'm not sure what help I could be; I'm not a programmer or anything lol).

6) Message boards : Number crunching : GPU Requirements (Message 49871)
Posted 4 Jul 2011 by Furl Hawkens
Thanks for the info on running the GPU always, though I had already set it. I have been crunching with BOINC for a few years now; I'm just new to MW@H lol.

As for why I'm not using my 4890, only the 3850: it's because the 4890 overheats. It always has, unfortunately.

It's an XFX Radeon HD 4890 XXX edition with a "lifetime warranty". When running stress tests or distributed computing projects (i.e. anything that actually pushes it), it will completely overheat (i.e. pass 100C and lock up the computer). It does OK for games, but still averages around 95C when playing things like Crysis and GTA IV.

When I contacted XFX, they actually told me that the card was never designed to be pushed the way FurMark or DC projects push it. They said that unless it started to fail from overheating with normal PC games, they refused to warranty it out. They straight-out told me not to run the programs that are overheating it lmao...

This was when the card was brand new; it still has the issues, it's still holding me back, and they still refuse to warranty it...

All of this was through email, so I still have it in writing if anyone here wants to see it.

My advice: avoid XFX.
7) Message boards : Number crunching : GPU Requirements (Message 49806)
Posted 2 Jul 2011 by Furl Hawkens
Hey! Wanted to mention AGP bus cards. I am currently running an ATi HD3850 on an AGP bus with the most recent CCC, drivers, and AMD/ATi AGP bus patches.

Though this card does not *officially* support Stream or OpenCL in the AGP form factor (the PCIe 3850 does), it seems to perform well.

I am currently running MW@Home on my AGP rig (2.8 GHz Pentium Dual-Core, Radeon HD 3850 AGP) and am posting an average credit of ~10,000 with a turnaround time of 0.09 days. This is using BOINC; I am unsure whether it's using ATi Stream or OpenCL.

Joined MW@H 4 days ago and have already racked up 123,000 credits on my AGP computer.
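Side note on the numbers: 123,000 credits over 4 days is ~30,750 credits/day, yet the reported average is only ~10,000. That gap is expected, since BOINC's recent average credit (RAC) decays with roughly a one-week half-life, so it takes weeks to catch up to the true daily rate. A rough sketch of the convergence (the exponential model and the one-week half-life are assumptions based on BOINC's credit averaging):

```python
# Why a 4-day-old host's average credit lags its actual daily output.
# Assumption: RAC behaves like an exponential moving average with a
# one-week half-life (BOINC's documented credit-averaging behavior).

total_credit = 123_000
days_active = 4
half_life_days = 7.0

daily_rate = total_credit / days_active  # 30,750 credits/day

# With continuous crunching from zero, RAC approaches the daily rate as:
#   RAC(t) = rate * (1 - 2**(-t / half_life))
rac_estimate = daily_rate * (1 - 2 ** (-days_active / half_life_days))

print(f"actual daily rate:     {daily_rate:.0f} credits/day")
print(f"estimated RAC at 4 d:  {rac_estimate:.0f}")
```

Under these assumptions the estimate lands right around 10,000, matching the reported average, so the low figure reflects the averaging window rather than the hardware.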

Long and short of it:

Please add the HD 38XX AGP series as a supported GPU.

Also, if you have an older AGP number cruncher and want to add some punch, now you know what to pick up lol. :)


©2023 Astroinformatics Group