[HN Gopher] Nx: Multi-dimensional tensors Elixir lib with multi-...
       ___________________________________________________________________
        
       Nx: Multi-dimensional tensors Elixir lib with multi-staged
       compilation (CPU/GPU)
        
       Author : thibaut_barrere
       Score  : 125 points
       Date   : 2021-02-17 16:37 UTC (4 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | themgt wrote:
        | What's really amazing to me is that Nx is installed as a
        | normal hex package. `defn` is a standard Elixir macro:
       | https://github.com/elixir-nx/nx/blob/main/nx/lib/nx/defn.ex#...
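        | 
        | Using it is just ordinary Elixir. A minimal sketch (the module
        | name MyMath and the function double/1 are hypothetical, and it
        | assumes Nx is already listed as a mix dependency):

```elixir
defmodule MyMath do
  # Brings the defn macro into scope.
  import Nx.Defn

  # defn reads like a plain function definition, but the macro routes
  # the body through Nx's numerical-definition compiler pipeline.
  defn double(t) do
    Nx.multiply(t, 2)
  end
end
```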
        
       | lawn wrote:
        | Elixir as a language is really approachable. If you're looking
        | to dive in, I recommend the book `Elixir in Action`, but the
        | official docs are also quite good.
        
       | 0b01 wrote:
        | I don't find this useful in any way, since Elixir is weakly
        | typed and a worse Lisp than Python.
        
         | lawn wrote:
         | How exactly is Elixir "a worse Lisp than Python"?
         | 
         | Elixir is a functional language and it even has Lisp-style
         | macros.
        
           | 0b01 wrote:
           | deleted
        
             | pmarreck wrote:
             | douches get downvotes
             | 
             | https://games.greggman.com/game/dynamic-typing-static-
             | typing...
             | 
             | The hard evidence that static typing is an overall-win is
             | lacking. Period.
             | 
             | or, more succinctly,
             | 
             | https://i.imgur.com/zbU1wwH.jpg
        
               | [deleted]
        
         | dudul wrote:
          | Elixir is not weakly typed; on the contrary, it is strongly
          | typed. The word you're looking for is "dynamic". JavaScript
          | and PHP are weakly typed; neither Erlang nor Elixir is.
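          | 
          | The distinction shows up immediately in a quick script; a
          | sketch, run against a stock Elixir install:

```elixir
# Strong typing: mixing a number and a binary in arithmetic raises an
# ArithmeticError instead of silently coercing to "11" like JavaScript.
result =
  try do
    1 + "1"
  rescue
    ArithmeticError -> :no_implicit_coercion
  end

IO.inspect(result)   #=> :no_implicit_coercion

# Numeric widening between integers and floats still works.
IO.inspect(1 + 1.0)  #=> 2.0
```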
        
           | conradfr wrote:
           | PHP is weakly typed but it has a strict type option nowadays,
           | and class properties can also be typed.
           | 
           | Combined with function parameters typehinting (which easily
           | beats typespecs) it's a great and productive developer
           | experience I must say.
        
           | [deleted]
        
       | thibaut_barrere wrote:
       | Also just above (https://github.com/elixir-nx/nx/tree/main/exla),
       | exla is an "Elixir client for Google's XLA (Accelerated Linear
       | Algebra). It includes integration with the Nx library to compile
       | numerical definitions (defn) to the CPU/GPU."
        
       | dnautics wrote:
        | The Nx architecture is quite good. I started work on a backend
        | that avoids requiring a jvm/bazel/python toolchain
        | (substituting a zig toolchain):
       | 
       | https://github.com/ityonemo/ez
       | 
        | The two languages feel like they were made for each other.
        | Despite the verbosity of doing some things in zig (the dynamic
        | tensor broadcast function is a bit of a beast), both languages'
        | similar compile-time metaprogramming facilities meant that
        | getting to the point of plugging in Nx's tensor addition +
        | tensor multiplication took about 200 LoC.
        
         | jeffreysmith wrote:
         | Cool project. I'd never heard of Zig, but it looks like an
         | interesting solution to the need you've identified.
         | 
         | I'd definitely agree that it seems valuable to have an Nx
         | version that comes with a smaller, simpler install for
         | development purposes. XLA is amazing, but it's not obvious that
         | you want to have that toolchain setup for local development in
         | all scenarios.
         | 
         | Would be great to see several different backends created for Nx
         | as a way of exploring what feature sets are really needed and
         | ensuring that the integration mechanisms are sufficiently
         | simple and general to support multiple solutions.
        
           | dnautics wrote:
            | Well, another use case is elixir nerves. You probably
            | don't want to figure out how to ship the xla architecture
            | to your embedded device (I think it has to JIT the code);
            | with nerves you can trivially cross-compile the nif shared
            | library with zigler and deploy to an arm architecture.
           | 
            | But kudos to Jose for building an architecture at a very
            | good level of abstraction: just in case xla falls out of
            | favor (it _is_ a Google project), nx won't have a hard
            | dependency, and there will be trivial migration paths out.
           | 
           | I think there could be some other interesting nx backends,
           | like "remote computation", or "transpile elixir and run on a
           | Julia worker node"
        
       | josevalim wrote:
       | Hi everyone, co-creator of Nx here!
       | 
       | I have just published an article on Dashbit's blog with a bit
       | more context on Nx and the design decisions behind it:
       | https://dashbit.co/blog/nx-numerical-elixir-is-now-publicly-...
       | 
       | It is hopefully a more in-depth reference than the README. If you
       | have any questions, I will be glad to answer them!
        
         | rdtsc wrote:
         | Just wanted to say thank you, and how much I appreciate your
         | work for the Elixir and Erlang communities.
         | 
          | A common complaint about Elixir and Erlang is that they
          | aren't suitable for numerical computations. Well, now they
          | are!
        
         | nnadams wrote:
         | I'm excited to try this out, especially after seeing the XLA
         | support.
         | 
          | Where do you see Nx going in the future? I saw there's an
          | MNIST example in the repo. Do you plan to include
          | higher-level features for neural networks, similar to
          | Keras/PyTorch?
        
           | josevalim wrote:
           | Sean is looking into higher-level features for NN - it will
           | likely be a separate library on top of Nx. For general
           | direction, I briefly discuss that at the end of the article
           | above. :) If something is unclear, please let me know!
        
       | losvedir wrote:
       | Jose is so prolific on such varied kinds of projects. I expect
       | he'll team up with John Carmack next to solve General AI...
       | 
       | I know the newest BEAM release has a new JIT compiler for faster
       | calculations. Is that work related to this at all? I'm guessing
       | not, since it seems like the speed up here is from moving the
       | calculation out of the BEAM.
        
         | dnautics wrote:
         | it's not related to the new jit.
        
       | 1_player wrote:
        | As a regular full-stack developer writing Elixir full time,
        | what is the use of such a library? I have heard of Tensorflow,
        | and there is mention of tensors in this article, but what
        | would be its practical use in the real world? When would one
        | write code like that shown in the examples?
       | 
       | I'm happy that Elixir is becoming more and more mainstream. It's
       | the best language I've used in my 15 years in the business.
        
       | vfclists wrote:
        | I learned something new: a creature called the Numbat.
        | 
        | I thought it was a pun on "wombat".
        
       | fouric wrote:
       | As a programmer with no ML/GPU experience, this reminds me of the
       | Spiral language: https://github.com/mrakgr/The-Spiral-Language
       | 
        | I wonder if Nx will succeed due to being "just" a library for
        | a popular language (Elixir) instead of being a separate
        | language entirely.
        | 
        | "Don't make a new language if your idea could be implemented
        | as a library" is good advice indeed.
        
       | cultofmetatron wrote:
       | As if I needed another reason to just stay in elixir fulltime.
       | Machine learning/deeplearning was one of the few niches I was
       | looking outside elixir for.
        
       | ajtulloch wrote:
        | A bit petty, but the first example is an unstable softmax
        | implementation:
        | 
        |     defn softmax(t) do
        |       Nx.exp(t) / Nx.sum(Nx.exp(t))
        |     end
       | 
       | See
       | https://ogunlao.github.io/2020/04/26/you_dont_really_know_so...
       | etc.
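        | 
        | One common fix, sketched here (it assumes Nx exposes a max
        | reduction, written as Nx.reduce_max/1; the exact helper name
        | may differ), is to shift the input by its maximum before
        | exponentiating. Softmax is invariant under that shift, but the
        | largest exponent becomes 0, so Nx.exp/1 can no longer
        | overflow:

```elixir
defn stable_softmax(t) do
  # softmax(t) == softmax(t - c) for any constant c; choosing
  # c = max(t) caps every exponent at exp(0) = 1.
  z = t - Nx.reduce_max(t)
  Nx.exp(z) / Nx.sum(Nx.exp(z))
end
```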
        
         | josevalim wrote:
         | Thanks for the info and the reference, that was a good read!
        
       ___________________________________________________________________
       (page generated 2021-02-17 21:00 UTC)