Post AUQFImag8gMlLebIsi by benmschmidt@vis.social
(DIR) Post #AUQFIWD32lxOXjQCpM by benmschmidt@vis.social
2023-04-06T17:03:32Z
0 likes, 1 repeats
Big day: Chrome just shipped WebGPU without flags. Someone asked me to ELI5 what this means, and I'm X-posting it here--it's more important than you'd think for both visualization and ML people. GPUs are processors that pretty much every computer/device has: their cores are weaker than CPU cores, but you have lots of them, so they can do the same job on many pieces of data in parallel. The "G" is for graphics, but it's turned out they're good for much more: https://developer.chrome.com/blog/webgpu-release/
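A toy sketch of that idea in plain JavaScript: conceptually, a GPU is a map over data where every element gets its own thread. The CPU version below runs sequentially, but nothing about the per-element function depends on its neighbors, which is exactly what makes it GPU-friendly.

```javascript
// The "kernel": one tiny computation, applied independently per element.
// A GPU would hand each element to a different thread and run them all
// at once; Array.prototype.map expresses the same shape of work.
const values = [1, 2, 3, 4, 5, 6, 7, 8];

const square = (x) => x * x;

const squared = values.map(square);
// → [1, 4, 9, 16, 25, 36, 49, 64]
```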
(DIR) Post #AUQFIYPAseqTLv6v0S by benmschmidt@vis.social
2023-04-06T17:07:34Z
0 likes, 0 repeats
The whole AI hype (and the crypto hype, too) of the last decade runs on GPU clusters that use GPU code not to render pixels on a screen, but to do lots and lots of matrix multiplication in parallel. Since GPUs are all slightly different, you need a language to write for them: for most computational uses, that's been a language called CUDA, which is tightly tied to NVIDIA hardware. (There are also graphics languages for your OS, but that's a different thing.) Installing CUDA drivers SUCKS.
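A minimal sketch of the operation in question--written in JavaScript rather than CUDA--just to show why it parallelizes so well: every output cell is an independent dot product, so thousands of weak cores can each take one.

```javascript
// Naive matrix multiplication. Each output cell out[i][j] depends only
// on row i of `a` and column j of `b`, so on a GPU every (i, j) pair
// can be computed by its own thread with no coordination.
function matmul(a, b) {
  const n = a.length, m = b[0].length, k = b.length;
  const out = Array.from({ length: n }, () => new Array(m).fill(0));
  for (let i = 0; i < n; i++)
    for (let j = 0; j < m; j++)
      for (let p = 0; p < k; p++)
        out[i][j] += a[i][p] * b[p][j];
  return out;
}

matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]);
// → [[19, 22], [43, 50]]
```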
(DIR) Post #AUQFIaLLfqyPMdKtJw by benmschmidt@vis.social
2023-04-06T17:11:16Z
0 likes, 0 repeats
In the world of the web, the only way to access the GPU has been through something called WebGL. It's old, and while you can do some neat stuff with it, TBH *you* probably can't, unless you're https://mathstodon.xyz/@rreusser, because it requires crazy hacks. And it's fundamentally built for graphics, not for the matrix-multiplication type stuff that is the bread and butter of deep learning models.
(DIR) Post #AUQFId6vNwDZwlbukq by benmschmidt@vis.social
2023-04-06T17:14:20Z
0 likes, 0 repeats
Since WebGL launched in 2011, lots of companies have been designing better languages that only run on their particular systems--something called Vulkan on Android, Metal on Apple devices, DirectX 12 on Windows, etc. These are insanely good if you're developing for just one platform, but they're even harder to run everywhere than CUDA. WebGPU is a design that sits on top of all these super low-level languages and lets people write GPU code that runs on most computers/phones out there.
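A minimal sketch of what that looks like from JavaScript, using the standard W3C entry points (`navigator.gpu`, `requestAdapter`, `requestDevice`): the same two calls work whether the implementation underneath is Vulkan, Metal, or DirectX 12--that mapping is the browser's job, not yours. Browser-only; the environment check is just for illustration.

```javascript
// WebGPU entry point. The adapter is the physical GPU; the device is
// the logical handle you create buffers and pipelines from.
async function getDevice() {
  if (typeof navigator === "undefined" || !navigator.gpu) {
    throw new Error("WebGPU not available in this environment");
  }
  const adapter = await navigator.gpu.requestAdapter();
  return adapter.requestDevice();
}
```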
(DIR) Post #AUQFIgvjBVddnCP1rE by benmschmidt@vis.social
2023-04-06T17:17:30Z
0 likes, 0 repeats
And crucially, it has this thing called "compute shaders" that lets you write programs that take data and turn it into other data. Working with data in WebGL is really weird--you have to do things like draw to an invisible canvas and then read the values back out of it. In WebGPU, you can actually just write code that does math. That means it's actually capable of doing--say--inference on a machine-learning model, multiplications on data frames, etc.
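A minimal sketch of a compute shader, assuming a `device` obtained from `navigator.gpu` (the API names are the real WebGPU/WGSL ones; the shader itself is a made-up toy that just doubles every number in a buffer). No invisible canvas, no pixel tricks--it's a function over data.

```javascript
// WGSL (WebGPU's shading language): each GPU thread doubles one element.
const shader = /* wgsl */ `
  @group(0) @binding(0) var<storage, read_write> data: array<f32>;

  @compute @workgroup_size(64)
  fn main(@builtin(global_invocation_id) id: vec3<u32>) {
    data[id.x] = data[id.x] * 2.0;
  }
`;

// Dispatching it (browser-only; `input` is a Float32Array).
async function doubleOnGpu(device, input) {
  // Upload the data into a GPU storage buffer.
  const buf = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(buf.getMappedRange()).set(input);
  buf.unmap();

  // Compile the shader and bind the buffer to @group(0) @binding(0).
  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: {
      module: device.createShaderModule({ code: shader }),
      entryPoint: "main",
    },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer: buf } }],
  });

  // One workgroup covers 64 elements, so round up.
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64));
  pass.end();
  device.queue.submit([encoder.finish()]);
  return buf; // the doubled values now live on the GPU in this buffer
}
```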
(DIR) Post #AUQFIineEWMbaidbXc by benmschmidt@vis.social
2023-04-06T17:22:34Z
0 likes, 0 repeats
There are already some crazy things out there, like a version of Stable Diffusion that runs in your web browser: https://github.com/mlc-ai/web-stable-diffusion. I wrote something a while back about why I think WebGPU makes JavaScript the most interesting programming language out there for data analysts/ML people: https://benschmidt.org/post/2020-01-15-webGPU/. And lots of stuff has become more conceivable since--for instance, a dataframe based on Apache Arrow that does massively parallel data transformations, or ChatGPT in the browser.
(DIR) Post #AUQFIkYThtPx2FYWAq by benmschmidt@vis.social
2023-04-06T17:30:13Z
0 likes, 0 repeats
Right now it's only released in Chrome, but it won't be a Google-only thing forever. It's an honest-to-goodness W3C standard like HTML, CSS, or SVG. All the browsers have been working on it; Chrome is just shipping first because they're insanely well funded compared to Safari and Firefox. (One of my favorite parts about reading the minutes of the WebGPU committee--yes, that's how insanely interested I am in this--is seeing the other browsers get jealous of Chrome's money.) https://github.com/gpuweb/gpuweb/wiki/Minutes-2022-08-10
(DIR) Post #AUQFImag8gMlLebIsi by benmschmidt@vis.social
2023-04-06T17:41:19Z
0 likes, 0 repeats
For projects like deepscatter, moving to WebGPU holds huge possibilities going forward, because it means we can do arbitrarily complex calculations directly on the GPU and pass the results back to the browser. Maps like https://atlas.nomic.ai/map/twitter can render 5,000,000 tweets incredibly fast, but push too much compute to the CPU--I have a long and growing list of things that are nearly impossible in WebGL (especially the WebGL 1.0 I'm stuck with for deepscatter...) but that should be quite easy in WebGPU.
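A sketch of the "pass the results back" step, using the standard WebGPU calls (assuming `srcBuffer` holds some compute output): storage buffers can't be mapped into JavaScript directly, so you copy into a small MAP_READ staging buffer first, then read out a plain typed array.

```javascript
// Copy GPU compute results back into ordinary JavaScript memory
// (browser-only sketch). After this, `result` is a normal
// Float32Array you can hand to any JS code.
async function readBack(device, srcBuffer, byteLength) {
  const staging = device.createBuffer({
    size: byteLength,
    usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
  });
  const encoder = device.createCommandEncoder();
  encoder.copyBufferToBuffer(srcBuffer, 0, staging, 0, byteLength);
  device.queue.submit([encoder.finish()]);

  await staging.mapAsync(GPUMapMode.READ);
  const result = new Float32Array(staging.getMappedRange().slice(0));
  staging.unmap();
  return result;
}
```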