[HN Gopher] Julia and JuliaHub: Advancing Innovation and Growth
       ___________________________________________________________________
        
       Julia and JuliaHub: Advancing Innovation and Growth
        
       Author : xgdgsc
       Score  : 119 points
       Date   : 2025-02-03 07:19 UTC (2 days ago)
        
 (HTM) web link (info.juliahub.com)
 (TXT) w3m dump (info.juliahub.com)
        
       | pjmlp wrote:
        | I love to see Julia's growth, if nothing else because it's
        | another Dylan-like take on Lisp ideas with a JIT compiler in the
        | box, and because the community keeps up the effort to overcome
        | tooling issues despite the critics.
        
         | tajd wrote:
          | Yeah, it's interesting to see how it's getting on! I wrote my
          | PhD simulation code in it from the ground up as it had nice
          | fundamental abstractions for parallelizable code. Of course now
          | it's just Python and Scala/Java, but Julia was great for my
          | purpose.
        
       | jakobnissen wrote:
       | It would be much more useful to see metrics that aren't
       | cumulative if we're interested in growth. Cumulative
       | measurements, by definition, will never decrease, even if Julia
       | were to fall in popularity.
        
         | tpoacher wrote:
         | indeed; something like an h5-index would be interesting to see.
        
       | NeutralForest wrote:
        | I like the language, but I can't help but feel it missed the
        | train and that the ergonomics improvements it offers are too
        | small to justify switching over from Python.
        
         | pjmlp wrote:
          | Depends on which train the Julia folks want to board.
        
           | NeutralForest wrote:
           | It felt to me like they wanted to be the language for ML/DL,
           | which they haven't achieved. They clearly have been working
           | more towards scientific stuff + ML, all the differential
           | equations and math packages are a testament to that (as well
           | as the pharma stuff with Puma).
           | 
           | I'm not aware of what the vision is currently tbh
        
             | affinepplan wrote:
             | I think one really good use case is complex simulations.
        
             | mbauman wrote:
             | The key for me -- as someone who has been around for a long
             | time and is at JuliaHub -- is that Julia excels most at
              | problems that _don't_ already have an efficient library
             | implementation.
             | 
             | If your work is well-served by existing libraries, great!
             | There's no need to compete against something that's already
             | working well. But that's frequently not the case for
             | modeling, simulation, differential equations, and SciML.
        
               | catgary wrote:
                | The ODE stuff in Julia is nice, but I think
                | diffusers/JAX is a reasonable backbone to copy over
                | whatever you need from there. I do think Julia is doing
                | well in stats and has gotten some mindshare from R in
                | that regard.
               | 
                | But I think a reasonably competent Python/JAX programmer
               | can roll out whatever they need relatively easily
               | (especially if you want to use the GPU). I do miss
               | Tullio, though.
        
               | lagrange77 wrote:
                | > But I think a reasonably competent Python/JAX
               | programmer can roll out whatever they need relatively
               | easily
               | 
                | You mean in terms of the ODE stuff Julia provides?
        
               | catgary wrote:
               | Diffusers is pretty well done (I think the author was
               | basically rewriting some Julia libraries and adapting
               | them to JAX). I can't imagine it being too hard to adapt
               | most SciML ODE solvers.
               | 
               | For simulations, JAX will choke on very "branchy"
                | computations. But honestly, I've had very little success
               | differentiating through those computations in the first
               | place and they don't run well on the GPU. Thus, I'm
               | generally inclined to use wrappers around C++ (or ideally
               | Rust) for those purposes (my use-case is usually some
               | rigid-body dynamics style simulation).
        
         | dv_dt wrote:
          | It does feel like Julia will not make the leap to displace
          | Python, but for a long time Python offered too few improvements
          | over Perl, so it's not completely out of the question.
        
       | joshlk wrote:
        | According to Stack Overflow trends, Julia's popularity is very
        | small and decreasing:
       | 
       | https://trends.stackoverflow.co/?tags=julia
        
         | mjgant wrote:
          | Or that's the LLM/ChatGPT effect. You can see similar
          | downtrends with other languages.
        
         | NeutralForest wrote:
          | Even huge languages like Python and JavaScript show a decline
          | after 2022, which suggests ChatGPT is probably responsible. It
          | would be better to have some other measure, imo.
        
           | joshlk wrote:
            | It measures the proportion of questions for that language out
            | of all languages, so a general decline in Stack Overflow
            | questions is already accounted for in the metric.
        
             | NeutralForest wrote:
             | There are too many confounding factors still.
        
         | amval wrote:
         | That's mostly because Julia questions get answered on its
         | Discourse or Slack. The sharp decline is due to an automatic
         | cross-post bot that stopped working.
         | 
          | No one bothered fixing it, in large part because Discourse is
          | the main place of discussion, as far as I know.
        
         | eigenspace wrote:
         | Julia users don't go to Stack Overflow because we have better
         | options.
        
         | veqq wrote:
          | Stack Overflow's popularity has decreased a lot; many
          | communities have left entirely.
        
       | jarbus wrote:
        | I've used, and am still using, Julia for my PhD research. It's
        | perfect for parallel/distributed computing, and the neural
        | network primitives are more than enough for my purposes. Anything
        | I write in pure Julia runs really, really fast, and there are
        | great profiling tools to improve performance further.
        | 
        | Julia also integrates with Python, with stuff like PythonCall.jl.
        | I've gotten everything to work so far, but it hasn't been smooth.
        | The Python code is always the major bottleneck though, so I try
        | to avoid it.
        | 
        | Overall, Julia is a significantly better language in every single
        | aspect except for the ecosystem and the occasional environment
        | issue, which you'll often get with conda anyways. It's really a
        | shame that practically nobody cares about it compared to Python.
        | It supports multi-dimensional arrays as a first-class citizen,
        | which means that each package doesn't have its own array type
        | like torch, numpy, etc., and you don't have to constantly
        | convert between them.
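        | 
        | As a minimal sketch of the PythonCall.jl interop (assuming
        | PythonCall.jl and numpy are both installed; the specific values
        | are arbitrary), you can call into Python and pull the result back
        | into a plain Julia array:
        | 
        |     using PythonCall                   # Julia <-> Python bridge
        | 
        |     np = pyimport("numpy")             # import a Python module
        |     x  = np.linspace(0, 1, 5)          # a numpy array, wrapped as a Py object
        | 
        |     v = pyconvert(Vector{Float64}, x)  # convert to a native Julia Vector
        |     sum(v .^ 2)                        # from here on it's pure Julia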
        
         | frakt0x90 wrote:
          | I agree on all points. I have used Python for 15 years and
          | Julia for 3, and I reach for Julia most of the time for
          | personal projects. I was really stoked when, at work, the only
          | FOSS solver for our problem was in Julia, so we wrote the rest
          | in it for easy integration. The only thing I dread is having to
          | look for a new package, since the ecosystem can be quite
          | fragmented.
        
         | Bostonian wrote:
         | Your last sentence applies equally to Fortran. How would you
         | compare Julia and Fortran?
        
           | realo wrote:
            | Julia uses LLVM for its JIT architecture, if I recall
           | correctly.
           | 
           | That makes it a good candidate for running well on ARM
           | platforms (think embedded data processing at the edge).
           | 
            | Not sure how well Fortran does on ARM.
        
             | pjmlp wrote:
              | Fortran has done quite well on almost every major CPU since
              | the 1950s, including GPUs.
              | 
              | Actually, one of the reasons CUDA won the hearts of
              | researchers over OpenCL is that Khronos never cared for
              | Fortran, and even C++ was late to the party.
              | 
              | I attended one Khronos webinar where the panel was puzzled
              | by a question from the audience regarding a Fortran support
              | roadmap.
              | 
              | NVidia is sponsoring the work on the LLVM Fortran frontend,
              | so the same applies there.
             | 
             | https://flang.llvm.org/docs/
        
               | pklausler wrote:
               | "sponsoring" in this case means writing nearly all of it
               | ourselves (although we've had lots of help from Arm and
               | some others on specific areas like OpenMP).
        
               | pjmlp wrote:
                | I see. I do follow LLVM conference talks, but not that
                | deeply.
        
           | SatvikBeri wrote:
           | Julia is generally higher level than Fortran, with syntax
           | inspired by Python/R/Matlab. We've been able to reliably hire
           | Math PhDs and quickly get them productive in Julia, which
           | would take much longer with Fortran.
        
           | eigenspace wrote:
            | Julia is dynamically typed, has a very rich type system, and
            | offers powerful metaprogramming and polymorphism tools.
           | 
           | Julia also has an active thriving ecosystem, and an excellent
           | package manager.
        
           | ted_dunning wrote:
            | Julia adds some pretty amazing stuff with multiple dispatch
            | and run-time compilation. What this means is that you can
            | glue code together in ways that are impossible in other
            | languages.
            | 
            | One example is a system that I built using three libraries.
            | One was a C library from Postgres for geolocation, another
            | was Uber's H3 library (also C), and a third was a Julia
            | native library for geodesy. From Julia, I was able to extend
            | the API of the H3 library and the Postgres library so that
            | all three libraries would inter-operate transparently. This
            | extension could be done _without_ any mods to the packages I
            | was importing.
            | 
            | In a slightly similar vein, if you have a magic whizbang way
            | of looking at your data as a strange form of matrix, you can
            | simply implement a few optimized primitive matrix operations
            | and the standard linear algebra libraries will now use _your_
            | data structure. Normal languages can't really do that.
           | 
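            | A minimal sketch of that second case (OnesMatrix is a
            | made-up type, purely for illustration): implementing only
            | size and getindex from the AbstractMatrix interface is enough
            | for the generic LinearAlgebra routines to accept the new
            | type.
            | 
            |     using LinearAlgebra
            | 
            |     # Hypothetical lazy matrix type: every entry is 1.0, no storage.
            |     struct OnesMatrix <: AbstractMatrix{Float64}
            |         n::Int
            |     end
            | 
            |     # The two methods that plug it into the AbstractMatrix interface.
            |     Base.size(A::OnesMatrix) = (A.n, A.n)
            |     Base.getindex(A::OnesMatrix, i::Int, j::Int) = 1.0
            | 
            |     A = OnesMatrix(3)
            |     A * ones(3)   # the generic matrix-vector product just works
            |     tr(A)         # and so do standard LinearAlgebra functions
            | 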
           | More on that second case and the implications in the
           | following video:
           | 
           | https://www.youtube.com/watch?v=kc9HwsxE1OY
        
         | SirHumphrey wrote:
          | I actually find that the ecosystem is not that big of an issue
          | for me in Julia. I guess my specific use-case (data analysis,
          | numerical simulations) is probably the most developed part of
          | the ecosystem, but within it I find the ecosystem much more
          | homogeneous than, for example, Python's - most things work with
          | most other things (e.g. units or measurement-uncertainty
          | libraries work automatically with a plotting library).
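          | 
          | A minimal sketch of that kind of composability (assuming
          | Unitful.jl and Measurements.jl are installed; the numbers are
          | arbitrary):
          | 
          |     using Unitful, Measurements
          | 
          |     d = (5.10 ± 0.02)u"m"   # a length with an uncertainty
          |     t = (1.00 ± 0.01)u"s"   # a time with an uncertainty
          | 
          |     v = d / t               # units and uncertainty both propagate
          |     println(v)              # a speed in m/s, with its propagated error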
        
       | tmvphil wrote:
       | As someone working with it day to day, coming from around 18
       | years of mostly python, I wish I could say my experience has been
        | great. I find myself constantly battling with the JIT,
        | compilation, and recompilation, and waiting around all the time
        | (sometimes 10 to 15 minutes for some large projects). Widespread
        | macro usage makes stack traces much harder to read. Lack of
        | formal interfaces means a lot of static checking is not
        | practical. Pkg.jl is also not great; version compatibility is
        | kind of tacked on and has odd behavior.
       | 
       | Obviously there are real bright spots too, with speed, multiple
       | dispatch, a relatively flourishing ecosystem, but overall I
       | wouldn't pick it up for something new if given the choice. I'd
        | use JAX or C++ extensions for performance and settle on Python
        | for the high-level parts, despite its obvious warts.
        
         | catgary wrote:
         | Yeah, Jax with Equinox, jaxtyping, and leaning hard on python's
         | static typing modules + typeguard lets you pretend that you
         | have a nice little language embedded in python. I swore off
         | Julia a few years ago.
        
       | 6gvONxR4sf7o wrote:
        | I do scientific computing and a Lisp was one of my first
        | languages, so I feel like I ought to be the target audience, but
        | it just never quite catches me.
       | 
        | It's almost statically compilable, which has almost gotten me to
        | pick it up a few times, but apparently it still can't compile a
        | lot of the most important ecosystem packages yet.
       | 
       | The metaprogramming has almost gotten me to pick it up a few
       | times, but apparently there aren't mature static anti-footgun
       | tools, even to the degree of mypy's pseudo-static analysis, so I
       | wouldn't really want to use those in prod or even complex toy
       | stuff.
       | 
       | It's so damned interesting though. I hope it gets some of this
       | eventually.
        
       | kayson wrote:
       | I'm curious how people feel about the JIT compilation time vs
       | runtime tradeoff these days. Any good recent benchmarks?
        
         | affinepplan wrote:
          | The Chapel folks did a really nice benchmark last year that
          | included Julia, where it landed pretty much right on the Pareto
          | frontier of code size vs performance.
          | 
          | I know that's not exactly answering your question, but you
          | might be interested:
          | https://chapel-lang.org/ChapelCon/2024/chamberlain-clbg.pdf
        
           | bradcray wrote:
           | The ~10-minute video for this talk is here, if anyone's
           | interested in the narrative behind the slides:
           | https://www.youtube.com/watch?v=U8KM8wv32js
        
       | cbruns wrote:
        | I am a MATLAB and Python user who has flirted with Julia as a
        | replacement. I don't love the business model of JuliaHub, which
        | feels very similar to Mathworks in that all the cool toolboxes
        | are gated behind a 'contact sales' or high-priced license. The
        | free 20 hours of cloud usage is a non-starter. Also, it seems
        | that by default all JuliaHub usage is cloud-based? On-prem and
        | airgapped (something I need) are implied to be $$$.
        | 
        | Open sourcing and maintaining some components of things like
        | JuliaSim or JuliaSim Control might expand adoption of Julia for
        | people like me. I will never be able to convince my company to
        | pay for JuliaHub if their pricing is similar to Mathworks.
        
       | toolslive wrote:
       | We do statistical modeling in Python in our company. When a
       | statistician asked for R, I said "no, but you can have Julia".
       | He's quite happy with it, and we're planning to move some stuff
       | over.
        
       | Kalanos wrote:
        | With some serious repositioning, I think there is still an
        | opportunity for Julia to displace Python tools like
        | polars/pandas/numpy, airflow, and pytorch -- with a unified
        | ecosystem that makes it easy to transition to the GPU and lead a
        | differentiable programming revolution. They have the brain power
        | to do it.
        | 
        | The future of Python's main open source data science ecosystem,
        | NumFOCUS, does not seem bright. Despite performance improvements,
        | Python will always be a glue language. Python succeeds because
        | the language and its tools are _EASY TO USE_. It has nothing to
        | do with computer science sophistication or academic prowess -
        | it humbly gets the job done and responds to feedback.
        | 
        | In comparison to Mojo/MAX/Modular, the Julia community doesn't
        | seem to be concerned with capturing share from Python or picking
        | off its use cases. That's the real problem. There is room for
        | more than one winner here. However, have the people that wanted
        | to give Julia a shot already done so? I hope not, because there
        | is so much richness to their community under the hood.
        
         | catgary wrote:
          | Julia has really lost the differentiable programming mindshare
          | to JAX. I've spent weeks or months getting tricky gradients to
          | work in Julia, only to have everything "just work" in JAX. The
          | quality of the autograd is night and day, and it comes down to
          | the basic design decisions of the respective "languages" (in
          | the sense that JAX JIT-compiles a subset of Python) and their
          | intermediate representations.
          | 
          | Fundamentally, when you keep a tight, purely functional core
          | representation of your language (e.g. jaxprs) and decompose
          | your autograd into two steps (forward mode and a compiler-level
          | transpose operation), you get a system in which it is
          | _substantially_ easier to guarantee correct gradients, which is
          | much more composable, and which even makes it easier to define
          | custom gradients.
          | 
          | Unfortunately, Julia didn't actually have any proper PLT or
          | compiler people involved at the outset. This is the original
          | sin I see as someone with an interest in autograd. I'm sure
          | someone more focused on type theory has a more cogent criticism
          | of their design decisions in that domain and would identify a
          | different "original sin".
          | 
          | In the end, I think they've made a nice MATLAB alternative, but
          | there's a hard upper bound on what they can reach.
        
           | affinepplan wrote:
            | > Julia didn't actually have any proper PLT or compiler
            | people involved at the outset.
            | 
            | While I don't disagree that JAX currently outshines Julia's
            | autodiff options in many ways, I think comments like this are
            | 1. false, 2. rude, and 3. unnecessary to make your point.
        
             | catgary wrote:
             | Julia was a scientific computing language made by
             | scientific computing experts. They did a great job on some
             | things, but whiffed a few major decisions early on.
        
               | affinepplan wrote:
               | It's a general purpose language made by experts in a
               | myriad of subjects.
        
               | catgary wrote:
               | I'm sorry, but I'm going to disagree with you on that.
               | Can you point to any of the language designers who had a
               | background in programming language theory? The closest
               | thing I see is Bezanson's work on technical computing,
               | which seems laser-focused on array programming. I don't
               | really see anything related to types or program
               | transformations.
        
         | tomnicholas1 wrote:
         | > The future of Python's main open source data science
          | ecosystem, NumFOCUS, does not seem bright. Despite performance
         | improvements, Python will always be a glue language.
         | 
         | Your first sentence is a scorching hot take, but I don't see
         | how it's justified by your second sentence.
         | 
         | The community always understood that python is a glue language,
         | which is why the bottleneck interfaces (with IO or between
         | array types) are implemented in lower-level languages or ABIs.
         | The former was originally C but often is now Rust, and Apache
         | Arrow is a great example of the latter.
         | 
          | The strength of using Python is that when you want to do
          | anything beyond pure computation (e.g. networking), the rest of
          | the world has already built a package for it.
        
       ___________________________________________________________________
       (page generated 2025-02-05 23:01 UTC)