Welcome to MilkyWay@home

Posts by Jaguar

1) Message boards : News : MilkyWay@home screensaver coming soon (Message 41152)
Posted 30 Jul 2010 by Jaguar
Post:
You can use a recursive Gaussian filter, and the performance will not change with the size of the filter.

If you are going for a low-image-quality, high-speed filter, then use a box filter and precompute as much as you can.
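The box-filter advice above can be sketched with a prefix sum: once the running sums are precomputed, every window average costs one subtraction and one division, so the runtime is independent of the filter width (the same property the post claims for a recursive Gaussian). This is an illustrative 1-D sketch, not code from the project; the function name `box_filter` and the clamped-edge behaviour are my own choices.

```python
def box_filter(signal, width):
    """Box-filter `signal` with an odd window `width`; edges are clamped.

    Cost is O(n) regardless of `width`, because each output sample is
    computed from a precomputed prefix-sum table with one subtraction.
    """
    assert width % 2 == 1, "use an odd window so the filter is centred"
    half = width // 2
    n = len(signal)

    # Precompute prefix sums: prefix[i] = sum of signal[0:i].
    prefix = [0.0]
    for x in signal:
        prefix.append(prefix[-1] + x)

    out = []
    for i in range(n):
        lo = max(0, i - half)          # clamp the window at the edges
        hi = min(n, i + half + 1)
        out.append((prefix[hi] - prefix[lo]) / (hi - lo))
    return out
```

Running three box passes in succession gives a close approximation to a Gaussian blur, which is a common way to get Gaussian-like quality at box-filter speed.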
2) Message boards : Number crunching : No Milkyway with GTX480 (Message 41099)
Posted 27 Jul 2010 by Jaguar
Post:

You are right that it's within NVidia's capability to pull a rabbit out of the hat and jump back in - they are a good company at heart - but the sheer realities of chip development mean that's not going to happen for 4 or 5 years. They dropped the ball big time, and first have to salvage what they can from "Fermi" before committing to "son of Fermi"; sheer cost pressures dictate that, let alone fabrication, which they still don't have a grip on.

ATI let NVidia into the market 15 to 20 years ago because ATI got complacent; NVidia have just returned the honours ..... (!)

I hope NVidia do pull out a rabbit in 4 or 5 years' time, else the dynamics of competition die out, and we all lose big time with inflated price/performance and longer development cycles with no viable competition.

I'm not sure if you are familiar with VLSI design, but NVidia could be back on top at 28nm, and if things go well with their silicon spin they could have a nice 40nm GPU too.

There is nothing wrong with the Fermi architecture, and there is nothing wrong with the R600 architecture. The physical implementation of these chips is terrible because the time for that stage was spent designing logic instead. The real killer of a chip is time, not making a bad architectural choice.

The big unknown is the move to combine CPU & GPU onto one chip (again - what goes around comes around!), and Intel are coming into play on the sidelines because of it. This migration back to one chip could well sink NVidia in the long term, at consumer level at least, if not all levels, as it has no inherent CPU base.

My money is on the main long-term market being an ATI/AMD v Intel fight 5 years going forward, with the two of them fighting over going back to one chip - a combined CPU/GPU. NVidia will still be there .... maybe .... but it would be the last throw at consumer level before they abandon consumer cards and go back to CUDA compute only.

Regards
Zy

The relentless goal of computers over the past decades has been to make them smaller and faster. NVidia can become a major player in the mobile market with ARM. Also, I don't see a reason they could not go x86; Cyrix didn't have a license either.
©2024 Astroinformatics Group