Message boards : Number crunching : NVIDIA Eyes Post-CUDA Era of GPU Computing

Profile Carlesa25
Joined: 13 Nov 10
Posts: 328
Credit: 72,619,453
RAC: 0
Message 22660 - Posted: 9 Dec 2011 | 15:26:48 UTC


Hello: The future of programming. OpenACC allows developers to leverage parallel acceleration on GPUs with only minor modifications to their existing (or new) code. Greetings.

http://www.hpcwire.com/hpcwire/2011-12-07/nvidia_eyes_post-cuda_era_of_gpu_computing.html
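For anyone wondering what those "minor modifications" look like in practice, here is a minimal sketch (mine, not taken from the article) of a SAXPY-style loop with an OpenACC directive in C. It assumes an OpenACC-capable compiler such as PGI's; the loop, array names and sizes are made up for illustration.

#include <stdio.h>

#define N 1000000

int main(void)
{
    static float x[N], y[N];
    const float a = 2.0f;

    /* Initialise the inputs on the host. */
    for (int i = 0; i < N; i++) {
        x[i] = (float)i;
        y[i] = 1.0f;
    }

    /* The single directive below is the "minor modification": it asks an
       OpenACC compiler to offload the loop to the accelerator and manage
       the data movement of x and y. A compiler without OpenACC support
       simply ignores the pragma and runs the loop serially on the CPU. */
    #pragma acc parallel loop copyin(x[0:N]) copy(y[0:N])
    for (int i = 0; i < N; i++) {
        y[i] = a * x[i] + y[i];
    }

    printf("y[0] = %f, y[%d] = %f\n", y[0], N - 1, y[N - 1]);
    return 0;
}

Built with something like "pgcc -acc saxpy.c" the loop runs on the GPU; built with plain gcc it still compiles and runs on the CPU, which is exactly the portability argument the article makes.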

Michael Kingsford Gray
Joined: 28 Nov 11
Posts: 21
Credit: 121,646,463
RAC: 0
Message 22872 - Posted: 1 Jan 2012 | 4:15:32 UTC - in response to Message 22660.
Last modified: 1 Jan 2012 | 4:16:20 UTC

At bloody last.

They are (or seem to be) including parallel FORTRAN as a contributor language!
For massively parallel math tasks it is as superior to the syntactically ambiguous C, C++, or CUDA as micro-photo-lithography is to a stone chisel.

Those mathematicians who have programmed (say) a Cray in parallel FORTRAN for some safety-critical task can attest to the veracity of this assertion.
Those who have not should reserve comment, positive or negative.

