[HN Gopher] Bumblebee: GPT2, Stable Diffusion, and More in Elixir
___________________________________________________________________
Bumblebee: GPT2, Stable Diffusion, and More in Elixir
Author : clessg
Score : 171 points
Date : 2022-12-08 20:49 UTC (2 hours ago)
(HTM) web link (news.livebook.dev)
(TXT) w3m dump (news.livebook.dev)
| JediLuke wrote:
| This is really amazing, and I think it will make wiring some of
| these models together much easier. Elixir is making great
| strides toward being a fantastic choice for ML.
| tylrbrkr wrote:
| For sure. The pace at which they've been making all these
| improvements is really impressive.
| [deleted]
| sam5q wrote:
| Huge step for ML in Elixir! Looking forward to using this and
| seeing what others do as well.
| josevalim wrote:
| Hi everyone, glad to be back here with the official announcement!
|
| It is late here but feel free to drop questions and I will answer
| them when I am up. Meanwhile I hope you will enjoy the content we
| put out: the announcements, example apps, and sample notebooks!
| tommica wrote:
| It's really interesting - I need to play with this!
| pbowyer wrote:
| Congratulations on this release!
|
| If I have a Hugging Face model that I've fine-tuned, can I load
| it using Bumblebee? I fine-tuned ConvNeXt, changed it into a
| multi-label classifier, and saved it as a PyTorch model. It
| works great, but being able to use it in Livebook instead of a
| Jupyter notebook would be fantastic.
|
| I think I'd have to convert the format, but what then?
| seanmor5 wrote:
| No need to convert the format! Bumblebee supports loading
| parameters directly from PyTorch. You can just pass
| `{:local, "path/to/checkpoint"}` to `Bumblebee.load_model`.
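|
| A minimal sketch of what that could look like from a Livebook
| cell, assuming the checkpoint was exported as a directory with
| the config plus PyTorch weights; the path and serving choices
| below are placeholders and details may differ across Bumblebee
| versions:
|
|     # Load the fine-tuned parameters straight from the local
|     # PyTorch checkpoint directory
|     {:ok, model_info} =
|       Bumblebee.load_model({:local, "/path/to/convnext-checkpoint"})
|
|     # The matching featurizer handles the image preprocessing
|     {:ok, featurizer} =
|       Bumblebee.load_featurizer({:local, "/path/to/convnext-checkpoint"})
|
|     # Build a serving and classify an image (StbImage is one way
|     # to get the image into an Nx tensor)
|     serving = Bumblebee.Vision.image_classification(model_info, featurizer)
|     image = StbImage.read_file!("example.jpg") |> StbImage.to_nx()
|     Nx.Serving.run(serving, image)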
| chem83 wrote:
| Congrats on the launch! Very cool!
|
| Alternatively to EXLA/Torchx, any thoughts on supporting an ML
| compiler frontend like Google IREE/MLIR by generating StableHLO
| or LinAlg? This could pave the way towards supporting multiple
| hardware targets (Vulkan-based, RVV-based, SME-based etc.) with
| minimal effort from the framework.
| josevalim wrote:
| Yes, we have been eyeing IREE, but no work has started yet.
| However, our compiler architecture was designed exactly to
| make this possible, so it would definitely be welcome if
| someone decides to blaze a trail down this road!
| seanmor5 wrote:
| There is a project in the ecosystem that builds on top of
| MLIR and can target any MLIR dialect, but it is not mature
| enough yet.
|
| I think it would be great to see!
| losvedir wrote:
| What's the motivation for the name? I get "numbat" to
| "numerical", but I'm not immediately seeing any connection for
| bumblebee.
|
| Maybe a more serious question: is anyone using Elixir ML in
| production? I'm absolutely gobsmacked at the quantity and
| quality of development effort that's gone into it (and use
| Livebook daily, though not for ML stuff). It's clearly a major
| focus for the team. I'm wondering if it's ready for production
| adoption, and if so, if anyone has used it "in anger" yet.
| seanmor5 wrote:
| We are all really into bumblebees, specifically the kind that
| can transform into 1977 yellow Camaros.
| cigrainger wrote:
| We use it in production at Amplified! It's been a joy. We
| train our own large language models and have fine-tuned and
| deployed them using Elixir. I talked about it at ElixirConf
| this year (https://m.youtube.com/watch?v=Y2Nr4dNu6hI).
|
| We were able to completely eliminate a few python services
| and consolidate to all Elixir for ETL and ML.
| afhammad wrote:
| > Maybe a more serious question: is anyone using Elixir ML in
| production?
|
| Check out this recent talk about how https://www.amplified.ai/
| moved from Python to an Elixir ML stack:
| https://www.youtube.com/watch?v=Y2Nr4dNu6hI
| AlphaWeaver wrote:
| Congrats on the launch! Been eagerly sitting in the #elixir IRC
| channel all day :)
|
| How easy would it be to support OpenAI's new Whisper
| transcription model in Bumblebee?
| seanmor5 wrote:
| It should not be too hard! From what I saw, Whisper is similar
| to BART, and we already have BART. The missing piece is a
| library for processing audio into tensors.
| tylrbrkr wrote:
| Would Nx Signal be that missing piece?
| https://github.com/polvalente/nx-signal
| seanmor5 wrote:
| For some of it! I was thinking more of something for decoding
| different audio encodings into Nx tensors, i.e. doing
| everything up until Nx.
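|
| To make that concrete, here is a rough sketch of the kind of
| glue that's missing today, leaning on ffmpeg for the actual
| decoding (the flags, sample rate, and file name are my own
| choices rather than any official API):
|
|     # Decode any audio file to raw 32-bit float PCM, mono,
|     # 16 kHz, then wrap the bytes in an Nx tensor.
|     {pcm, 0} =
|       System.cmd("ffmpeg", [
|         "-i", "speech.ogg",
|         "-ac", "1",       # mono
|         "-ar", "16000",   # 16 kHz sample rate
|         "-f", "f32le",    # raw little-endian 32-bit floats
|         "pipe:1"          # write to stdout
|       ])
|
|     audio = Nx.from_binary(pcm, :f32)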
| thedangler wrote:
| Excited to play around with this.
| pawelduda wrote:
| Awesome release, I might want to try this in my Phoenix project.
| Are there any major disadvantages to doing this vs. bringing
| the Python ecosystem into the mix, aside from, obviously,
| missing out on tooling?
| josevalim wrote:
| I think perhaps the bigger issue is that you will be blazing a
| new trail, which means fewer resources if you get stuck. But
| you can join the #machine-learning channel of the Erlang
| Ecosystem Foundation or ask around on the Elixir Forum!
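|
| For what it's worth, the Phoenix side can be a thin layer over
| Nx.Serving. A rough sketch, with made-up module names and
| options that may differ between Bumblebee/Nx versions:
|
|     # In the application supervision tree: build the serving once
|     {:ok, model_info} = Bumblebee.load_model({:hf, "gpt2"})
|     {:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "gpt2"})
|     serving = Bumblebee.Text.generation(model_info, tokenizer)
|
|     children = [
|       {Nx.Serving, serving: serving, name: MyApp.GPT2, batch_size: 4},
|       MyAppWeb.Endpoint
|     ]
|
|     # Later, from a controller or LiveView: requests get batched
|     # across callers automatically
|     Nx.Serving.batched_run(MyApp.GPT2, "Elixir is")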
___________________________________________________________________
(page generated 2022-12-08 23:00 UTC)