[HN Gopher] JIT/GPU accelerated deep learning for Elixir with Ax...
___________________________________________________________________
JIT/GPU accelerated deep learning for Elixir with Axon v0.1
Author : ahamez
Score : 79 points
Date : 2022-06-16 12:52 UTC (10 hours ago)
(HTM) web link (seanmoriarity.com)
(TXT) w3m dump (seanmoriarity.com)
| m00dy wrote:
| How does GPU acceleration work under the hood? There is no such
| info in the blog post.
| throwawaymaths wrote:
| There's a library called Nx which abstracts Elixir code into
| backend engines like XLA. You could write your own backend too
| if you wanted.
|
| Edit: xla, not jax
| josevalim wrote:
| It uses [Nx](https://github.com/elixir-
| nx/nx/tree/main/nx#readme) (think of it as an equivalent to
| numpy/jax), which uses Google's XLA as one of the possible
| backends to compile and run code on the GPU.
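To make the Nx-to-XLA pipeline concrete, here is a minimal sketch (assuming the EXLA backend package is installed; the exact function names follow the Nx docs and may differ in your version): numerical definitions written with `defn` build a computation graph instead of running eagerly, and a pluggable compiler such as EXLA lowers that graph through XLA to CPU or GPU code.

```elixir
defmodule Softmax do
  import Nx.Defn

  # `defn` traces this into a graph; the configured compiler
  # (e.g. EXLA) JIT-compiles the whole expression via XLA.
  defn softmax(t) do
    Nx.exp(t) / Nx.sum(Nx.exp(t))
  end
end

# Route defn compilation through EXLA (Elixir bindings for XLA).
# With a CUDA/ROCm build of EXLA this runs on the GPU; otherwise
# it falls back to the XLA CPU backend.
Nx.Defn.global_default_options(compiler: EXLA)

Softmax.softmax(Nx.tensor([1.0, 2.0, 3.0]))
```

The key point for the question above: the Elixir code itself never talks to the GPU; the backend abstraction does, which is why swapping or writing a new backend is possible.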
| mountainriver wrote:
| The ONNX runtime seems to be the only really valuable part.
|
| Libraries like this pop up in all kinds of languages, only to
| slowly die off due to lack of community. Data scientists and ML
| engineers aren't going to be writing in Elixir any time soon.
| Maybe loading a model into a web server though.
| josevalim wrote:
| > The ONNX runtime seems to be the only really valuable part.
|
| I actually disagree. :)
|
| If all we had was bindings for ONNX, then we would have to tie
| ourselves to a single format (and potentially a single
| runtime). However, with ONNX built on top of a solid
| foundation (Axon), we get the flexibility and robustness to
| handle a fast-moving ML landscape.
|
| As an example, I would actually prefer to load pre-trained
| models from Hugging Face directly into Axon and bypass ONNX
| altogether. The most immediate benefit for an Elixir developer
| is that they don't need Python on their machine for the ONNX
| conversion. But it also means that I can slightly tweak pre-
| trained models too. And that's another win.
|
| Perhaps your point is that nobody will train a neural network
| from scratch in Elixir, and maybe that's true, but I would be
| wary of looking at ONNX as the only possible outcome of the
| work done so far.
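The two paths discussed above can be sketched as follows (hypothetical: `AxonOnnx.import/1`, the model filename, and the exact `Axon` calls are assumptions; check the axon_onnx and Axon docs for current signatures):

```elixir
# Path 1: bring a pre-trained model in via the ONNX interchange
# format. No Python toolchain is needed on the Elixir side once
# the .onnx file exists.
{model, params} = AxonOnnx.import("resnet50.onnx")

# Path 2 follows from Path 1: because the imported graph is an
# ordinary Axon model, it can be tweaked like a hand-written one,
# e.g. freezing the pre-trained layers and swapping the head for
# fine-tuning on a 10-class task.
tweaked =
  model
  |> Axon.freeze()
  |> Axon.dense(10, activation: :softmax)
```

This is the flexibility argument in miniature: ONNX is one entry point into Axon rather than the boundary of what the stack can do.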
|
| Indeed there is an associated high risk to new efforts like
| this but that's often part of creating something new. I have
| heard countless times that there is no value in creating a new
| programming language, until the value was suddenly there.
| marcus_cemes wrote:
| Unfortunately, I have to agree. I've been using Elixir for the
| last year and it's one of the _very_ few languages that I've
| really enjoyed using, a lot. I would love to see it get
| traction, but it doesn't have the same momentum as Python, nor
| Rust for that matter.
|
| I don't think there's anything special about Python and data
| science/ML; it just happens to be everyone's go-to wrapper
| language around C code. Personally, I dislike Python: it has
| been nothing but pain for me in the past (not necessarily the
| language, but the design of popular libraries such as
| matplotlib), but it's always handy when you need to get
| something done.
|
| That being said, there are a lot of really high-quality Elixir
| projects being developed at the moment: LiveView, Livebook
| (which I think far exceeds Jupyter in innovation), Nx, etc.
| Perhaps that's one of the benefits of having a smaller
| community. I like how Elixir has slowly developed into a mature
| and beautifully designed language, with a focus on stability
| and not hype.
| ch4s3 wrote:
| I think when LiveBook desktop[1] is ready, it's going to
| become a lot more compelling.
|
| [1]https://github.com/livebook-dev/livebook#desktop-app
| shiryel wrote:
| Yeah, only time will tell, but one thing that I love about
| Elixir is its community, which tries to support and improve the
| "big projects" instead of remaking them, e.g.:
|
| - Phoenix for web dev
|
| - Nerves for IoT / Embedded systems
|
| - Membrane for multimedia / streaming
|
| Hopefully Nx and Axon will be in this list soon :)
| nestorD wrote:
| > Libraries like this pop up in all kinds of languages to only
| slowly die off due to lack of community.
|
| I do not agree on the very specific subject of deep learning.
| Libraries pop up and then die because they lack competitive
| performance (often they don't even have GPU support) or are too
| constrained by the methods they use to stay performant (e.g.
| restricting the shape of the neural network).
|
| This particular library builds on XLA (the compiler behind
| JAX), which should give it competitive performance and a
| flexible enough interface. Once you have that, a large
| community is not that important (most deep learning models are
| built from fairly simple building blocks that are easy to port
| if need be).
| davydog187 wrote:
| I'm so impressed by how quickly Sean and the ML working group
| have made progress on Nx, Axon, etc. Great job, excited to see
| where this goes!
| ch4s3 wrote:
| Yeah, the ability to import external models is great!
___________________________________________________________________
(page generated 2022-06-16 23:01 UTC)