[HN Gopher] GPU.js
___________________________________________________________________
GPU.js
Author : graderjs
Score : 75 points
Date : 2021-10-08 09:09 UTC (13 hours ago)
(HTM) web link (gpu.rocks)
(TXT) w3m dump (gpu.rocks)
| matt3D wrote:
| Does this run into problems with Nvidia GPUs, since the Optimus
| platform disables GPU acceleration in browsers by default?
|
| So users would need to know to go to their graphics settings page
| in order to get any benefit.
| tenaciousDaniel wrote:
| I think this uses Canvas/WebGL to provoke the GPU into performing
| calculations behind the scenes. If that's true, I imagine this
| solution will fall by the wayside as WebGPU becomes production
| ready.
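For context, the GPU.js programming model (per gpu.rocks) is a plain JS function run once per output cell, with `this.thread.x`/`this.thread.y` naming the cell; the library transpiles that function into a fragment shader. A minimal CPU-only sketch of the dispatch model (no GPU involved; `runKernel` is an illustrative stand-in, only the commented call is the real gpu.js API):

```javascript
// Emulates the GPU.js dispatch model on the CPU: the kernel body runs
// once per output cell, reading its coordinates from `this.thread`.
// With gpu.js installed, the real thing would be roughly:
//   const kernel = new GPU().createKernel(kernelBody).setOutput([w, h]);
function runKernel(kernelBody, [width, height], ...args) {
  const out = [];
  for (let y = 0; y < height; y++) {
    const row = new Float32Array(width);
    for (let x = 0; x < width; x++) {
      // GPU.js exposes the cell coordinates as this.thread.{x,y}
      row[x] = kernelBody.apply({ thread: { x, y } }, args);
    }
    out.push(row);
  }
  return out;
}

// A matrix multiply written in kernel style: one output cell per call.
function matMulKernel(a, b, size) {
  let sum = 0;
  for (let i = 0; i < size; i++) {
    sum += a[this.thread.y][i] * b[i][this.thread.x];
  }
  return sum;
}

const A = [[1, 2], [3, 4]];
const B = [[5, 6], [7, 8]];
const C = runKernel(matMulKernel, [2, 2], A, B, 2);
// C[0] = [19, 22], C[1] = [43, 50]
```

On the GPU, each cell's computation runs as an independent shader invocation, which is why the kernel body may only read its inputs and its own coordinates.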
| vegetablepotpie wrote:
| It should. Then again, WebCL existed 10 years ago to solve this
| problem but never took off, because you needed vendor support
| for each device. So hacks like mapping matrix math to textures
| and back again are much more likely to work on the widest range
| of devices, which is why you see these libraries. With W3C
| support, JS GPU programming should be built into web browsers,
| and we shouldn't need such hacks to build high-performance web
| apps.
| brundolf wrote:
| There's a long tradition on the web of bundling various
| implementations of an emerging feature under a library that
| can automatically pick the best one for the platform it's
| running on. Usually these act as temporary scaffolding until
| the browser-sanctioned API becomes ubiquitous, at which point
| they're phased out
| pjmlp wrote:
| Given the 10 years it took for WebGL 2.0 to finally become
| available everywhere, and that WGSL is still half baked,
| production readiness is still a couple of years away.
| manomanomano wrote:
| It's for Node.js
| zamadatix wrote:
| Your comment probably got downvoted more drastically than normal
| for being a new account but on the off chance you're a
| legitimate new account that just had an unlucky first
| comment:
|
| It's compatible with both web and Node. In Node it uses
| https://github.com/stackgl/headless-gl to provide a WebGL
| compatible implementation as Node doesn't ship with GPU
| access out of the box. The project is looking into
| https://github.com/maierfelix/webgpu or similar to instead
| provide Node with a WebGPU compatible implementation. Both
| require N-API. The tracking issue can be found here for
| reference https://github.com/gpujs/gpu.js/issues/507.
| worldsayshi wrote:
| I also felt the web page could make it clearer that it works on
| both Node.js and the web. Right now it could be read as
| supporting both, or only Node.
| bperson wrote:
| I checked, "Get WebGPU as an alternate backend (help wanted)"
|
| > We need this [WebGPU] in GPU.js, possibly as a sub-project:
| https://github.com/maierfelix/webgpu Once it becomes stable,
| and well supported and tested, we could possibly make it the
| default fallback.
|
| https://github.com/gpujs/gpu.js/issues/507
| aditya wrote:
| Now to just make it mine ethereum...
| Aaronstotle wrote:
| Ha, showed this to my co-worker and that was our immediate
| thought as well.
| MaxikCZ wrote:
| What's the implication here? Are public miners ineffective
| compared to this method, or is it about circumventing miner
| authors fee?
| SavantIdiot wrote:
| This was posted multiple times on HN over the past 5 years.
|
| And every time I see it, I cry that JavaScript doesn't have
| something like numpy.
|
| numpy is just so damn logical and fast (and comprehensive!) that
| I don't consider myself a Python programmer, I'm a numpy user.
|
| EDIT: (removed complaint). That said: this looks like a good
| foundation for someone to use for implementing a fully featured
| numpy in JS.
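The core of what makes numpy "logical and fast" is the strided ndarray: one flat buffer plus shape/stride metadata, so transposes and slices are O(1) metadata changes rather than copies. A toy sketch of that idea in JS (all names here are illustrative, not from any existing library):

```javascript
// A toy strided ndarray: one flat buffer plus shape/strides, the
// representation numpy is built on. Transpose just swaps metadata.
class NDArray {
  constructor(data, shape, strides) {
    this.data = data;    // flat Float64Array, shared between views
    this.shape = shape;  // e.g. [rows, cols]
    // Default to C-contiguous (row-major) strides.
    this.strides = strides ?? shape.map((_, i) =>
      shape.slice(i + 1).reduce((a, b) => a * b, 1));
  }
  get(...idx) {
    // Flat offset = dot product of the index with the strides.
    let off = 0;
    for (let i = 0; i < idx.length; i++) off += idx[i] * this.strides[i];
    return this.data[off];
  }
  // Transpose without copying: reverse shape and strides.
  get T() {
    return new NDArray(this.data,
      [...this.shape].reverse(), [...this.strides].reverse());
  }
}

const a = new NDArray(new Float64Array([1, 2, 3, 4, 5, 6]), [2, 3]);
// a.get(1, 2) === 6; a.T.get(2, 1) === 6 (same buffer, no copy)
```

A full numpy-alike would add broadcasting and vectorized ops on top of this representation, which is where most of the implementation effort goes.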
| itsnotlupus wrote:
| Speaking of Python things, there's an actively developed
| JavaScript implementation of TensorFlow called TensorFlow.js,
| which has its own set of backends to leverage GPUs in browsers
| or in Node.js through WebGL, WebGPU, or Node bindings into C++
| code for CUDA support, alongside WASM and pure-JS
| implementations.
|
| https://github.com/tensorflow/tfjs
| SavantIdiot wrote:
| I've used it! My gripe is that I'm not fond of their async
| implementations; they bloat the code when I want to do
| something simple. I understand the tradeoff is there in order
| to utilize external hardware for bigger tensor applications,
| but sometimes I just want a thin API for n-dimensional
| operations... like ... well, you know what I'm going to say.
| :)
| amelius wrote:
| Matrix multiplication is a nice example, but does it also support
| _sparse_ matrix-vector multiplication (because this is what is
| most useful in scientific computations, e.g. iterative solvers)?
| dragontamer wrote:
| SIMD-sparse matrix multiplication is possible but much more
| complicated. That's why BLAS libraries exist.
|
| As long as you're willing to write the full scope of operations
| in a SIMD style, I'd bet that you could do it. But sparse is a
| lot more difficult to do than dense.
|
| I know the basics of sparse matrix multiplication (there are
| many types: COO, CSR, LIL, etc. etc. Each representation would
| lead to a subtly different matrix multiplication algorithm).
| Load-balancing these operations across the GPU would be
| difficult, but it looks like major BLAS libraries have solved
| the problem already (ex: cuSPARSE, CUDA's sparse-matrix library
| for GPU compute)
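Of the formats listed above, CSR (compressed sparse row) is the most common for sparse matrix-vector multiply. A plain-JS sketch of y = A·x in CSR form, which also shows where the GPU load-balancing difficulty comes from:

```javascript
// Sparse matrix-vector multiply (y = A * x) with A in CSR form:
// values[] holds the nonzeros, colIdx[] their column indices, and
// rowPtr[] marks where each row's nonzeros begin/end in those arrays.
function spmvCSR(values, colIdx, rowPtr, x) {
  const n = rowPtr.length - 1;
  const y = new Float64Array(n);
  for (let row = 0; row < n; row++) {
    let sum = 0;
    // On a GPU, rows have wildly varying nonzero counts, so splitting
    // this inner loop evenly across threads is the hard part.
    for (let k = rowPtr[row]; k < rowPtr[row + 1]; k++) {
      sum += values[k] * x[colIdx[k]];
    }
    y[row] = sum;
  }
  return y;
}

// A = [[10, 0, 0],
//      [ 0, 0, 2],
//      [ 3, 0, 4]]
const values = [10, 2, 3, 4];
const colIdx = [0, 2, 0, 2];
const rowPtr = [0, 1, 2, 4];
const y = spmvCSR(values, colIdx, rowPtr, [1, 1, 1]);
// y = [10, 2, 7]
```

COO and LIL store the same nonzeros differently (coordinate pairs, per-row lists), which is why each format implies a subtly different multiplication loop.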
| mywittyname wrote:
| It doesn't look like it, but fluid.html shows examples of using
| GLGS, which _might_ offer a means to improve performance for
| sparse matrices.
| worldsayshi wrote:
| At a glance I felt like the example section is very much what I
| had hoped for. Step by step progression from basic to juicy in
| what looks like just the right amount of steps:
| https://gpu.rocks/#/examples
|
| (The examples didn't seem to work on Chrome for Android atm
| though)
| airstrike wrote:
| Conway's Game of Life demo:
| https://observablehq.com/@brakdag/conway-game-of-life-gpu-js
| etaioinshrdlu wrote:
| I made a Game of Life demo using tensorflow.js:
| https://jsfiddle.net/5jobgzpq/2/
| Lerc wrote:
| I have been using GPU.js over the last few weeks, and I feel
| like there are two parts to the magic that makes it work.
|
| The headline feature is the translation of JavaScript into
| shader code, but the underlying architecture that bridges
| arrays and textures is, I think, where the real utility lies
| for me.
|
| I think I'd actually be better off working with just that second
| part. Writing the kernels in a more native shader language but
| being able to pass in arrays and get arrays out the other end.
|
| While the aspect of writing the shader itself in JavaScript is
| cool when you consider what it has to do to make it work, when it
| doesn't work it can be a real struggle to find out why.
|
| Also, there seems to be a bit of bit-rot in the documentation:
| the link for the API reference is https://doxdox.org/gpujs/gpu.js/
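The array-to-texture plumbing praised above ultimately comes down to reinterpreting floats as pixel bytes: each 32-bit float maps to one 4-channel RGBA8 pixel, the texture is sampled in the shader, and the result is read back and reinterpreted. A CPU-only sketch of that round trip (this illustrates the general WebGL float-packing trick, not GPU.js internals specifically):

```javascript
// Pack a float array into texture-upload bytes: one RGBA8 pixel
// (4 bytes) per 32-bit float, via a shared ArrayBuffer.
function floatsToPixels(floats) {
  return new Uint8Array(Float32Array.from(floats).buffer);
}

// Reverse the packing: reinterpret the read-back bytes as floats.
function pixelsToFloats(pixels) {
  return new Float32Array(pixels.buffer.slice(
    pixels.byteOffset, pixels.byteOffset + pixels.byteLength));
}

const input = [1.5, -2.25, 3.0];
const pixels = floatsToPixels(input);   // 12 bytes = 3 "pixels"
const output = pixelsToFloats(pixels);  // back to [1.5, -2.25, 3.0]
```

Exposing just these two ends, with the kernel in between written in a native shader language, is essentially the workflow described above: pass in arrays, get arrays out the other end.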
___________________________________________________________________
(page generated 2021-10-08 23:00 UTC)