Mind-bending new programming language for GPUs just dropped... - Code Report
Very interesting, and very early stage rn.
"Yet, it runs on massively parallel hardware like GPUs, with near-linear speedup."
What a bold claim...
@eveninghere @ruffsl That claim is correct. But so far it doesn't have great performance on a single core.
Sorry, how could it be correct? On that page there's no explanation of what they're measuring to begin with, and no mention of the benchmark setup either. There are problems that can never scale linearly due to the reality of hardware.
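For context, Amdahl's law makes that ceiling concrete: any serial fraction of the work bounds the achievable speedup, no matter how many cores you add. A minimal sketch in Python (purely illustrative, not tied to the project's benchmarks):

```python
# Amdahl's law: speedup = 1 / (s + (1 - s) / p), where s is the
# serial fraction of the work and p is the core count.
# Illustration only; numbers are hypothetical, not from the linked page.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Theoretical speedup on `cores` cores when `serial_fraction`
    of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (1, 4, 16, 64):
    print(cores, round(amdahl_speedup(0.05, cores), 2))
# With just 5% serial work, 64 cores give roughly 15x, far from 64x.
```

So "near-linear" can only hold for workloads that are almost entirely parallel to begin with.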
The GitHub blurb says the language is comparable to general-purpose languages like Python and Haskell.
Perhaps unintentionally, this seems to imply that the language can speed up literally any algorithm linearly with core count, which is impossible.
If it can automatically accelerate a program that has parallel data dependencies, that would also be a huge claim, but one that is at least theoretically possible.
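To make that distinction concrete, here is a rough sketch (in Python, purely hypothetical and not code from the project) of the difference between a computation whose branches are independent, which a dependency-tracking runtime could in principle evaluate in parallel, and one whose steps form a serial chain, which it cannot:

```python
# Two ways to sum 1..n. The recursive halves of tree_sum share no
# data, so they could run in parallel; chain_sum reads the previous
# total at every step, so it is inherently sequential.
# Sketch only; example names are made up for illustration.

def tree_sum(lo: int, hi: int) -> int:
    """Divide-and-conquer sum; the two halves are independent."""
    if lo == hi:
        return lo
    mid = (lo + hi) // 2
    return tree_sum(lo, mid) + tree_sum(mid + 1, hi)

def chain_sum(n: int) -> int:
    """Each iteration depends on the previous total."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

assert tree_sum(1, 1000) == chain_sum(1000) == 500500
```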
I hope the demo starts soon…
(What a bullshit correlation/equation to start with.)
Is this just PR? The link is pure PR with no substance: it praises itself without any details on the benchmarking setup, and yet I see some comments here being positive.
This could be game-changing for introducing shader programming to more developers, if it pans out.
Yikes, that high CPU usage across threads. Definitely needs some more polishing.
What's wrong with them? Are you sure it's not just set to use 100% of all cores, with the OS doing some shuffling?
Futhark is another language with the same goals, executed differently.
Gotta read the paper, this is a game-changer.
Funny how they benchmarked an ARM CPU and not an x64 one, as if ARM CPUs are now faster than x64 ones.