Milkyway GPU on Mac?

Martin P.

Joined: 21 Nov 07
Posts: 52
Credit: 1,756,052
RAC: 0
Message 46310 - Posted: 19 Feb 2011, 22:29:37 UTC

What graphics card can I use and where can I buy it?

Thanks!

ID: 46310
Matt Arsenault
Volunteer moderator
Project developer
Project tester
Project scientist

Joined: 8 May 10
Posts: 576
Credit: 15,979,383
RAC: 0
Message 46311 - Posted: 19 Feb 2011, 22:32:42 UTC - in response to Message 46310.  

What graphics card can I use and where can I buy it?

Thanks!

Apple's OpenCL implementation doesn't seem to support doubles on the GPU. If/when that changes, there will be a Mac GPU application. Until then, I have one horribly hacky idea with a very small chance of working, which I'll try at some point.
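
If you want to check what your own machine reports, the standard test is whether the device lists the cl_khr_fp64 extension (or cl_amd_fp64 on some ATI stacks). A rough sketch using the plain OpenCL host API, nothing MilkyWay-specific:

```c
/* Sketch: list GPU devices and whether they advertise double-precision support.
 * Plain OpenCL 1.0 host API; build with -framework OpenCL on OS X, -lOpenCL elsewhere. */
#include <stdio.h>
#include <string.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void)
{
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint ndev = 0;

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS)
        return 1;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &ndev);

    for (cl_uint i = 0; i < ndev; ++i) {
        char name[256], ext[4096];
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_EXTENSIONS, sizeof(ext), ext, NULL);

        /* Doubles require cl_khr_fp64 (or cl_amd_fp64 on older ATI drivers) */
        int fp64 = strstr(ext, "cl_khr_fp64") || strstr(ext, "cl_amd_fp64");
        printf("%s: doubles %s\n", name, fp64 ? "available" : "NOT available");
    }
    return 0;
}
```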
ID: 46311
NullCoding*

Joined: 23 Sep 10
Posts: 24
Credit: 58,711,243
RAC: 0
Message 46339 - Posted: 22 Feb 2011, 1:49:50 UTC

That, or we can wait a bit until NVIDIA releases its new Quadro GPUs designed for the Mac Pro.

Perhaps when Apple gets around to putting decent GPUs in the iMacs and MacBook Pros we'll see double precision finally come to Macs. The GTX 5xxM series is a good bet, although those parts are limited by Apple's obligatory downclocking and reduced memory. So even if you had an MBP with a GTX 555M, you aren't likely to get the full 1536 MiB of memory or the 590/1180/1800 MHz clocks.

For instance, my GT 330M is compute 1.2, so no DP and thus no MW; it also has 256 MiB of memory instead of the 1024 seen on the OEM version. That limits the number of projects in which I can participate even more.
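
If you're curious what your own card reports, here's a rough sketch using the standard CUDA runtime device query (not MilkyWay@home code); double precision needs compute capability 1.3 or higher:

```c
/* Sketch: print each CUDA device's compute capability, memory, and DP support.
 * CUDA runtime API; build with nvcc, or a C compiler linked against -lcudart. */
#include <stdio.h>
#include <cuda_runtime_api.h>

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int i = 0; i < count; ++i) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);

        /* Double precision requires compute capability 1.3 or higher */
        int dp = prop.major > 1 || (prop.major == 1 && prop.minor >= 3);
        printf("Device %d: %s, compute %d.%d, %lu MiB, DP %s\n",
               i, prop.name, prop.major, prop.minor,
               (unsigned long)(prop.totalGlobalMem >> 20),
               dp ? "yes" : "no");
    }
    return 0;
}
```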

So, bottom line: CUDA on Mac has a long way to go. Lucky for us GPU enthusiasts that NVIDIA so avidly supports CUDA development on whatever architecture. Now we just need Apple to be more on board, and we'll see what happens.

ATi, on the other hand, I'm not sure about. I know that Mac Pros have ATi cards, shipping with 5850(s) last I checked, but I've not actually met anyone who has such a machine and uses it for BOINC.

I imagine Mac GPU crunching, while still in its infancy, will probably progress fairly fast over the next year (maybe less). Who knows.

--

Apple has their own ideas about how to best implement open standards...gotta love it...

Matt, what's your "horribly hacky" idea? Does it involve hacking some code into a Mac GPU client/app? Because I can see how that could end up "horribly." ;)


ID: 46339
Matt Arsenault
Volunteer moderator
Project developer
Project tester
Project scientist

Joined: 8 May 10
Posts: 576
Credit: 15,979,383
RAC: 0
Message 46341 - Posted: 22 Feb 2011, 15:28:34 UTC - in response to Message 46339.  

For instance, my GT 330M is compute 1.2, so no DP and thus no MW; it also has 256 MiB of memory instead of the 1024 seen on the OEM version. That limits the number of projects in which I can participate even more.

I have the same card in my laptop; it's really annoying for working on this when I'm not at my desktop.

Apple has their own ideas about how to best implement open standards...gotta love it...

It annoys me most that Apple's OpenCL compiler seems to be a proprietary fork of clang (that seems to be more or less the case with the Nvidia and ATI compilers too, except ATI doesn't use clang, though they do use LLVM). A few things in the CL compiler are nowhere to be found in clang, and Apple clearly added them.

Matt, what's your "horribly hacky" idea? Does it involve hacking some code into a Mac GPU client/app? Because I can see how that could end up "horribly." ;)

The issue is that Apple's OpenCL didn't support doubles on an ATI GPU when I tried to use it (I haven't tried any Nvidia + doubles OS X system). My horrible idea rests on the assumption that the actual binary that runs on the GPU shouldn't be any different from one built anywhere else. So I could try compiling the ATI GPU binaries on Linux, figure out the container format and whatever special pieces interact with the driver, and then inject the core pieces of the Linux-built binary (ATI's OpenCL kernel binary format seems to be at least partially documented; I'm not sure about Apple's). I don't see the chance of this working as very high, and even if it does, it will probably take a lot of work to figure out exactly what to do.
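
The standard plumbing for part of that already exists in OpenCL: you can pull the compiled device binary out of a built program and hand a binary straight to the runtime with clCreateProgramWithBinary, skipping the platform's compiler. A rough sketch (single device assumed; whether Apple's runtime would accept a binary produced by ATI's Linux stack is exactly the part I don't know):

```c
/* Sketch (not MilkyWay@home code): dump a built program's device binary,
 * and the call that would load such a binary directly, bypassing the
 * platform's OpenCL C compiler. Assumes one device per program. */
#include <stdio.h>
#include <stdlib.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* On the build machine: write the compiled kernel binary to a file. */
static void dump_binary(cl_program program, const char *path)
{
    size_t binSize = 0;
    clGetProgramInfo(program, CL_PROGRAM_BINARY_SIZES, sizeof(binSize), &binSize, NULL);

    unsigned char *bin = malloc(binSize);
    clGetProgramInfo(program, CL_PROGRAM_BINARIES, sizeof(bin), &bin, NULL);

    FILE *f = fopen(path, "wb");
    fwrite(bin, 1, binSize, f);
    fclose(f);
    free(bin);
}

/* On the target machine: feed the saved bytes straight to the runtime. */
static cl_program load_binary(cl_context ctx, cl_device_id dev,
                              const unsigned char *bin, size_t binSize)
{
    cl_int binStatus, err;
    cl_program p = clCreateProgramWithBinary(ctx, 1, &dev, &binSize,
                                             &bin, &binStatus, &err);
    clBuildProgram(p, 1, &dev, NULL, NULL, NULL);  /* still required, but no source compile */
    return p;
}
```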
ID: 46341
