[HN Gopher] Julia 1.10
___________________________________________________________________
Julia 1.10
Author : CoreyFieldens
Score : 136 points
Date : 2023-12-27 16:57 UTC (6 hours ago)
(HTM) web link (docs.julialang.org)
(TXT) w3m dump (docs.julialang.org)
| chestertn wrote:
| I'm not sure if Julia will ever take off. Right now there are
| huge investments in the AI space, and Julia has no presence
| there.
| affinepplan wrote:
| Julia has already taken off in certain niches
| spenczar5 wrote:
| What is an example? Are there any where it is dominant?
| affinepplan wrote:
| scientific models and simulations can be and are made very
| successfully in Julia. In particular, the diffeq landscape
| is probably the language's largest comparative advantage
| spenczar5 wrote:
| That's a broad area. I work in astrodynamic simulations,
| and don't know anyone doing much work in Julia. Maybe a
| couple of grad students playing with it, but that's it.
| 99% of the work is Python, Fortran, and C/C++.
|
| Are there subdomains that use it a lot? I am not sure
| what the diffeq landscape is exactly although it sounds
| related to dynamical simulations?
| affinepplan wrote:
| I'm sure there are other subdomains that make use of
| Julia, but in particular I think if your problem involves
| writing an evolution-like or agent-based-model-like
| simulation you may find the strengths of Julia
| particularly compelling
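For readers unfamiliar with the diffeq workflow being referred to, here is a minimal sketch, assuming the documented OrdinaryDiffEq.jl API (the model and solver choice are illustrative, not from the thread):

```julia
# Minimal sketch of solving an ODE with OrdinaryDiffEq.jl.
# Assumes the package is installed; model and solver are illustrative.
using OrdinaryDiffEq

# Logistic growth: du/dt = u * (1 - u), starting at u(0) = 0.1.
f(u, p, t) = u * (1 - u)
prob = ODEProblem(f, 0.1, (0.0, 10.0))
sol = solve(prob, Tsit5())   # Tsit5 is a common default explicit RK method

sol(10.0)   # the solution approaches the carrying capacity of 1
```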
| edgyquant wrote:
| "You may find" is not the same thing as "has taken off
| in" which was the claim
| pjmlp wrote:
| https://juliahub.com/case-studies/
| lagrange77 wrote:
| I recently found this:
|
| https://github.com/JuliaSpace/
|
| Out of curiosity, what Python, Fortran, and C/C++
| packages do you use / can you recommend?
| spenczar5 wrote:
| Astropy [0] lives at the heart of most work. It has a
| Python interface, often backed by Fortran and C++
| extension modules. If you use Astropy, you're indirectly
| using libraries like ERFA [6] and cfitsio [7] which are
| in C/Fortran.
|
| I personally end up doing a lot of work that uses the
| HEALPix [1] sky tessellation, so I use healpy [2] as well.
|
| Openorb is perhaps a good example of a pure-Fortran
| package that I use quite frequently for orbit propagation
| [3].
|
| In C, there's Rebound [4] (for N-body simulations) and
| ASSIST [5] (which extends Rebound to use JPL's pre-
| calculated positions of major perturbers, and expands the
| force model to account for general relativity).
|
| There are many more, these are just ones that come to
| mind from frequent usage in the last few months.
|
| ----
|
| [0] https://www.astropy.org/
|
| [1] https://healpix.jpl.nasa.gov/
|
| [2] https://healpy.readthedocs.io/en/latest/
|
| [3] https://github.com/oorb/oorb
|
| [4] https://rebound.readthedocs.io/en/latest/
|
| [5] https://github.com/matthewholman/assist
|
| [6] https://github.com/liberfa/erfa
|
| [7] https://heasarc.gsfc.nasa.gov/fitsio/
| lagrange77 wrote:
| Thank you very much for the detailed answer!
|
| Will look into those. I recently wrote a little n-body
| simulator to become familiar with Julia's
| DifferentialEquations.jl and that motivated me to learn
| more about astrodynamics.
| chestertn wrote:
| One of the things I don't like about the Julia ecosystem
| is the monolithic libraries that have tons of
| dependencies. DiffEq is one of those. I think it's fine for
| writing a script, but if you want to develop something more
| sophisticated, you want to keep your dependencies lean.
| krull10 wrote:
| You can always (slightly) reduce the DiffEq dependencies
| by adding OrdinaryDiffEq.jl instead of the meta
| DifferentialEquations.jl package. But lots of those
| dependencies arise from supporting modular functionality
| (changing BLAS, linear solvers, Jacobian calculation
| methods, in vs. out of place workflows, etc.). That said,
| the newer extension functionality may let more and more
| of the dependencies get factored out into optional
| extensions as time goes on.
| chestertn wrote:
| I think it's also the design philosophy. JuMP and
| ForwardDiff are great success stories and are packages
| that are very light on dependencies. I like those.
|
| The DiffEq library seems to pull you towards the SciML
| ecosystem and that might not be agreeable to everyone.
|
| For instance, a well-known Julia project that simulates
| differential equations seems to have implemented its own solvers
|
| https://github.com/CliMA/Oceananigans.jl
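For reference, the ForwardDiff.jl API praised above really is small; a sketch assuming only the package's two core entry points:

```julia
# ForwardDiff.jl in a nutshell: dual-number forward-mode AD on
# plain generic Julia functions, with essentially no setup.
using ForwardDiff

f(x) = sin(x) + x^2
df = ForwardDiff.derivative(f, 1.0)        # cos(1) + 2

g(v) = v[1]^2 + 3v[2]
gr = ForwardDiff.gradient(g, [2.0, 1.0])   # [4.0, 3.0]
```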
| ChrisRackauckas wrote:
| That's a bit different, and an interesting difference.
| Certain types of partial differential equations like the
| one solved there generally use a form of step splitting
| (i.e. a finite volume method with a staggered grid). Those
| don't map cleanly into standard ODE solvers since you
| generally want to use a different method on one of the
| equations. It does map into the SplitODEProblem form
| (though not a DynamicalODEProblem), and so there is a way to
| represent it, but we have not created optimized time
| stepping methods for that. But I work with those folks so
| I understand their needs and we'll be kicking off a new
| project in the MIT Julia Lab in the near future to start
| developing split step and multi-rate methods specifically
| for these kinds of PDEs.
|
| It's an interesting space because:
|
| -(a) there aren't really good benchmarks on the full set
| of options, so a benchmarking paper would be interesting
| to the field (which then gives a motivation to the
| software development)
|
| -(b) none of the implementations I have seen used the
| detailed tricks from standard stiff ODE solvers and so
| there's some major room for performance improvements
|
| -(c) there are some alternative ways to generate the stable
| steppers that haven't been explored, and we have some
| ideas for symbolic-numeric methods that extend the ideas
| of what people have traditionally done by hand here.
|
| So we do plan to do things in the future. And having
| Oceananigans is then great because it serves as a speed-
| of-light baseline: if you auto-generate an ocean model,
| do you actually get as fast as a real hand-optimized
| ocean model? That's the goal, and we'll see if we can get
| there.
|
| We have tons of solvers, but you always need more!
| ChrisRackauckas wrote:
| And you can always use the "Simple" versions,
| SimpleDiffEq.jl, SimpleNonlinearSolve.jl. Those libraries
| were made so that users could use exactly the same syntax
| but with dumbed down solvers with essentially zero
| latency. But yes the complete solvers are doing lots of
| fancy things for fancy cases, but SimpleTsit5 is
| recommended in the docs for small cases where you don't
| need all of the extra bells and whistles.
| pjmlp wrote:
| Plenty of examples,
|
| https://juliahub.com/case-studies/
| passion__desire wrote:
| You don't need Julia. Julia was trying to be a better Python.
| We will have a better Python in the form of Mojo.
| FridgeSeal wrote:
| Mojo is vapourware from a private company (and we all know
| how those turn out re programming languages) until proven
| otherwise.
| csjh wrote:
| What other private company languages are there? Swift is
| the best example I can think of, which matches Mojo's
| situation down to the head of the project.
| ReleaseCandidat wrote:
| IBM has some, ABAP, ...
| Conscat wrote:
| K is one such language.
| dist-epoch wrote:
| Mathematica
| sinkwool wrote:
| Aren't most languages invented and initially developed
| within a private company? Go, Dart, Java, JavaScript, C#,
| F#, VBA, Kotlin, Erlang, C (at AT&T), Rust (at Mozilla).
| The list is probably very long.
| notthemessiah wrote:
| Most of those are open source, and while initially
| developed at a private company, most are run by non-
| profits (such as the Rust Foundation for Rust). Also, the
| language itself is not usually the main product at those
| companies.
| markkitti wrote:
| I'm not sure that's a fair description anymore. They have
| distributed some SDKs at this point.
|
| I really do wish it were open source already though. I know
| they have laid out a roadmap for opening the source.
| ubj wrote:
| It should be fairly clear from studying the structure of
| Julia that it was never meant to be simply a better Python.
|
| Mojo also owes part of its design to the lessons it took
| from Julia (as per Chris Lattner [1]).
|
| [1]: https://news.ycombinator.com/item?id=35791125
| markkitti wrote:
| After looking at Mojo, I appreciated all the paradigms that
| Julia was pushing forward even more than I did before.
| Mojo's greatest asset and curse is focusing on being a
| better Python. Julia's greatest asset and curse is trying
| to do a lot more.
| jakobnissen wrote:
| As much as it pains me to say it, I don't think Julia will. It
| looks to me like the practical problems with Julia, while
| addressable, are being addressed too slowly.
|
| There are simply too many rough edges and usability problems
| as it is now, and at the current pace it will take maybe 10
| or 15 years to address them.
|
| On the other hand, the major use case for Julia is to have a
| fast, dynamic language. And it seems to me the time horizon for
| Python to become fast, or Rust or C++ to become dynamic is
| indefinite, so Julia is still the best bet in that space.
| __s wrote:
| What's wrong with JS/TS or Lua as a fast dynamic language?
| jakobnissen wrote:
| JavaScript doesn't compile to native code, so it isn't as
| fast. I've never tried LuaJIT, though; that's supposed to
| be on par with Julia.
| ReleaseCandidat wrote:
| > Javascript doesn't compile to native code, so isn't as
| fast.
|
| There are AOT compilers for JS.
| jakobnissen wrote:
| There are for Python too, but it's still slow. JS has slow
| semantics. It can't be made to be in the same performance
| league as Julia.
| markkitti wrote:
| One of my favorite things recently is using JS/TS with
| Julia, usually via a Pluto.jl notebook.
|
| There are also some nice demonstrations showing Julia
| compiled to WebAssembly:
| https://tshort.github.io/WebAssemblyCompiler.jl/stable/examp...
| edgyquant wrote:
| Python seems to be making rapid strides towards becoming fast
| oivey wrote:
| It's getting faster but not C fast. Its semantics mean it
| basically never will. You'll always need C extensions.
| markkitti wrote:
| Using Cython via pure Python mode seems pretty compelling
| if all you need is compilation. Compilability is just
| the beginning though.
| oivey wrote:
| Cython is pretty nice, especially for wrappers around C.
| Julia's ccall is honestly even better.
|
| Numba is also really nice, and for just compiling against
| arrays I like it better than Cython. Neither helps if you
| need data structures or abstractions, though.
| mardifoufs wrote:
| I don't think so, not the core language at least. In my
| tests I saw a 20-30% improvement at best compared to
| versions from 5 years ago. That's still a great job by the
| core maintainers, but I don't think Julia has to fear
| Python getting fast. The issue is more that Python is
| getting to the point where the ML ecosystem has already
| accepted using a slow language as long as everything
| around it is fast.
|
| Though as I said before, sometimes no amount of C/C++
| escape hatches can improve performance, since you have to
| use Python objects at some point or another and that will
| be the bottleneck. But by then, you won't be needing some
| of the stuff Julia offers, like the REPL and notebooks etc.
|
| I haven't used Julia a _lot_, but to me it's in a weird
| spot: in theory it would be ideal to start projects with,
| since you won't need to outgrow the language you start
| with; it's fast enough and has a pretty
| good/maintainable/sane design. But then you are sacrificing
| so much, and will need much more time to get started, that
| you might not ever get to that point anyway.
|
| Just as an example, debugging obscure problems or deploying
| more custom PyTorch models in prod is already pretty
| daunting at times, and it's the "best" and most popular ML
| framework in the world. I can't imagine how much more time
| consuming it would be with a much smaller, less used
| library/framework.
|
| So yeah, all of that to say that being way faster isn't how
| Julia will win. Maybe a push from an influential player or
| big tech might give it the momentum it needs.
| ubj wrote:
| Although I mostly agree with this sentiment and have deep
| concerns about Julia's flaws, I will say that Julia has a
| very high ceiling. If the right concerns were addressed in a
| timely manner, I wouldn't be surprised to see its adoption
| rate accelerate.
| ReleaseCandidat wrote:
| It has already taken off and found its not-so-small niche. I
| don't think it will ever be one of the "big 10" languages (by
| users), but it already has a _big_ user base.
| pjmlp wrote:
| Julia is flying for these folks already,
|
| https://juliahub.com/case-studies/
| chestertn wrote:
| That page is a bit marketing-heavy
| pjmlp wrote:
| No different from the usual "rewritten in Zig/Rust"
| articles on HN, and more industry relevant.
| systems wrote:
| Julia needs good, complete database connectivity libraries.
| To do better, it needs full support for MS SQL, Oracle, and
| other commercial DBs.
|
| All my data is in a database; Julia needs to become more DB
| oriented, that is it
| markkitti wrote:
| Are there solid C interfaces that can be used?
|
| A large part of why I started using Julia is that calling
| into other languages through the C FFI is pretty easy and
| efficient. Most of the wrappers are a single line. If there
| is no existing driver support, I would pass the C headers
| through Clang.jl, which automatically generates Julia
| wrappers from the C API.
|
| https://github.com/JuliaInterop/Clang.jl
|
| I most recently did this with libtiff. Here is the Clang.jl
| code to generate the bindings. It's less than 30 lines of
| stereotypical code.
|
| https://github.com/mkitti/LibTIFF.jl/tree/main/gen
|
| The generated bindings, with a few tweaks, are here:
|
| https://github.com/mkitti/LibTIFF.jl/blob/main/src/LibTIFF.j.
| ..
| pjmlp wrote:
| Yes, both Oracle (OCI) and SQL Server (ODBC), although they
| are quite low level.
| kloch wrote:
| I love the Julia language for its seamless arbitrary-precision
| math support and not much else. If we could get that in C I
| would be all set.
| notthemessiah wrote:
| Julia was ahead of the game with automatic differentiation,
| which took a few years for Python to get support. And it's
| still ahead of the game in integrating machine learning with
| scientific modeling. Macros and multiple dispatch are a game
| changer, and they allow people to hack and iterate on Julia
| far more easily than on Python.
|
| And don't get me started on how nice JuMP.jl is for
| mathematical optimization.
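A tiny Base-only illustration of the multiple-dispatch point above (the types here are made up for the example):

```julia
# Multiple dispatch: the method is chosen by the runtime types of
# *all* arguments, not just the receiver as in single-dispatch OO.
abstract type Shape end
struct Circle <: Shape; r::Float64 end
struct Square <: Shape; s::Float64 end

overlap(a::Circle, b::Circle) = "specialized circle/circle method"
overlap(a::Square, b::Square) = "specialized square/square method"
overlap(a::Shape,  b::Shape)  = "generic fallback"

overlap(Circle(1.0), Circle(2.0))   # hits the circle/circle method
overlap(Circle(1.0), Square(2.0))   # falls back to the generic method
```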
| 6gvONxR4sf7o wrote:
| > Julia was ahead of the game with automatic differentiation
| which took a few years for Python to get support.
|
| In what way is this true? Looks like Julia didn't exist until
| 2012. If I remember correctly, theano was the big AD thing in
| python at that point.
| 7thaccount wrote:
| JuMP is cool, but honestly... I find the native Python APIs
| from CPLEX & Gurobi to be best. The additional abstraction
| of JuMP or the equivalent Python frameworks is always a pain
| to me, as I don't need to switch solvers often.
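For context, the JuMP abstraction under discussion looks like this; a minimal sketch assuming JuMP's documented macro API (HiGHS is an illustrative open-source solver; CPLEX and Gurobi plug in the same way):

```julia
# Minimal linear program in JuMP; the solver backend is swappable.
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, x >= 0)
@variable(model, y >= 0)
@constraint(model, x + 2y <= 4)
@objective(model, Max, 3x + 4y)
optimize!(model)

(value(x), value(y), objective_value(model))   # optimum: x = 4, y = 0, objective 12
```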
| baldfat wrote:
| I like R and use it two ways: 1) Scheme-like functional-ish
| code, 2) the Tidyverse. I found Julia to be a lot of talk,
| but it seemed clunky to me.
| borodi wrote:
| By the way, there has been a pretty nice effort to reimplement
| the tidyverse in Julia with https://github.com/TidierOrg/Tidier.jl
| and it seems to be quite nice to work with, if you were missing
| that from R at least.
| LudwigNagasena wrote:
| I used Julia to build a macroeconomic model (DSGE-VAR) during my
| econ studies. I liked the conceptual decisions and the language
| per se (i.e. as a spec), but the DX was quite bad: low
| discoverability of features and proper typings, clunky
| metaprogramming, long compilation times, and the impossibility
| of redefining structs in the REPL.
| My interest died pretty fast because of it.
| huitzitziltzin wrote:
| At the very least, Tom Sargent and the New York Fed see it
| differently, so you may be the odd man out. If you haven't
| checked out the QuantEcon project, you are missing a great
| resource for exactly the problems you are working on.
| smabie wrote:
| My experience was exactly the same. This is probably unfair,
| but I got the impression that the people who made Julia never
| actually... used it? But of course that can't be true, so maybe
| my workflow was just significantly different from theirs?
|
| Not a fan of Python at all, but now I just stick with it for
| my quant analysis. There are tons of issues with Python too,
| but at least they are all known / well-documented problems
| (also, ChatGPT knows pandas / matplotlib / Python very well).
| ChrisRackauckas wrote:
| The load time improvements are amazing. Thanks to everyone that
| was involved. I've been using it locally for months now simply
| because of this feature and I had to update my "how to deal with
| compile-time" blog post
| (https://sciml.ai/news/2022/09/21/compile_time/) to basically say
| system images really aren't needed anymore with these
| improvements. With that and the improvements to parallel
| compilation I tend to not care about "first time to X" anymore.
| To me it's solved and I'm on to other things (though I
| personally need to decrease the precompilation time of
| DifferentialEquations.jl, but all of the tools exist in v1.10
| and that's on me to do; y'all have done your part!).
|
| Additionally:
|
| * Parser error messages are clearer
|
| * Stack traces are no longer infinitely long! They are good and
| legible!
|
| * VS Code auto-complete stuff is snappier and more predictive
| (might be unrelated, but is a recent improvement in some VS Code
| things)
|
| Altogether, I'm pretty happy with how this one shaped up and am
| looking forward to static compilation and interfaces being the
| next big focus areas.
| rthkljlkrj wrote:
| For those wondering what Chris is talking about, I just tried
| this:
|
|     using Plots; plot(sin)
|
| from a fresh start, and it's about 2 seconds on my Dell
| Latitude 7400 (Core i7).
| teruakohatu wrote:
| Julia 1.10 takes 1.00 seconds on my laptop, including loading
| Julia itself:
|
|     time julia -e "using Plots; plot(sin)"
| pjmlp wrote:
| That is the thing: many things only manage to succeed by
| having the patience to wait for the outcome of incremental
| improvements.
|
| Looking forward to updating my Julia installation.
| frakt0x90 wrote:
| Just wanted to say I appreciate all the Julia content and
| evangelism you put out. It's kept me excited for the language
| and is a big reason I still use it for most of my personal
| work.
| singularity2001 wrote:
| Are my main concerns resolved?
|
| * Hello World 200 MB ?
|
| * discoverability of functions: object.fun<tab>
| => fun(object) in REPL / IDE?
|
| object.<tab> => List of applicable functions?
| ChrisRackauckas wrote:
| > discoverability of functions
|
| I think for the most part this is solved. It works very well
| and has good integration with VS Code:
|
|     julia> integrator.<tab>
|     EEst               accept_step
|     alg                cache
|     callback_cache     differential_vars     do_error_check
|     dt                 dtacc                 ...
|
| etc. cut short.
|
|     julia> ODEProblem(<tab>
|     ODEProblem(f::SciMLBase.AbstractODEFunction, u0, tspan, args...; kwargs...)
|       @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\problems\ode_problems.jl:183
|     ODEProblem(sys::ModelingToolkit.AbstractODESystem, args...; kwargs...)
|       @ ModelingToolkit C:\Users\accou\.julia\packages\ModelingToolkit\arrCl\src\systems\diffeqs\abstractodesystem.jl:911
|     ODEProblem(f, u0, tspan; ...)
|       @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\problems\ode_problems.jl:187
|     ODEProblem(f, u0, tspan, p; kwargs...)
|       @ SciMLBase C:\Users\accou\.julia\dev\SciMLBase\src\problems\ode_problems.jl:187
|
| > Hello World 200 MB
|
| Not quite. There's a bunch of knobs you can use to get small
| binaries (I use this for industrial deployments often), but
| Jeff Bezanson gave a really nice talk at JuliaCon Local
| Eindhoven 2023 that described the reasons for the large
| binaries, what the memory is actually attributed to, and what
| to do about it
| (https://youtu.be/kNslvU3WD4M?si=hwo9AgXthNpiQ3-P). With the
| "normal options" you get to about 15MB now, still bad but not
| half as bad. The vast majority of that is the base system
| image. Jeff's talk then goes into the next steps with reducing
| the size of that base system image.
| markkitti wrote:
| How about a 16K Hello World?
|
|     julia> using StaticTools, StaticCompiler
|
|     julia> hello_world() = printf(c"Hello World\n")
|     hello_world (generic function with 1 method)
|
|     julia> compile_executable(hello_world, (), "./")
|     "~/hello_world"
|
|     shell> ./hello_world
|     Hello World
|
|     shell> du -h hello_world
|     16K     hello_world
|
| See https://github.com/brenhinkeller/StaticTools.jl for further
| details and limitations.
|
| For discovering methods you can do
|
|     julia> ?("hello", 1, 2.0)[TAB]
|     broadcast(f, x::Number...) @ Base.Broadcast broadcast.jl:844
|     readuntil(filename::AbstractString, args...; kw...) @ Base io.jl:520
|     ...
|
| See the Tab Completion section of the REPL documentation,
| https://docs.julialang.org/en/v1/stdlib/REPL/#Tab-completion .
| jszymborski wrote:
| Out of curiosity, how's the state of DL for Julia?
|
| Can I use PyTorch or JAX comfortably in Julia?
| teruakohatu wrote:
| Flux is quite a nice lower level library:
|
| https://github.com/FluxML/Flux.jl
|
| On top of that there are many higher level libraries such as
| Transformers.jl
|
| https://github.com/chengchingwen/Transformers.jl
| markkitti wrote:
| There is https://github.com/FluxML/Torch.jl. There are also
| Julia-native frameworks such as Flux, https://fluxml.ai/ .
| ChrisRackauckas wrote:
| Lux.jl does a really good job of being clear in its syntax
| and hackable. I couldn't recommend it more.
| https://lux.csail.mit.edu/. Here's good materials to start
| with: https://lux.csail.mit.edu/dev/tutorials/beginner/1_Basics
| Tarrosion wrote:
| What's the user-facing difference between Lux and Flux?
| darsnack wrote:
| How you interact with parameters.
|
| Lux is similar to Flax (Jax) where the parameters are kept
| in a separate variable from the model definition, and they
| are passed in on the forward pass. Notably, this design
| choice allows Lux to accept parameters built with
| ComponentArrays.jl which can be especially helpful when
| working with libraries that expect flat vectors of
| parameters.
|
| Flux lies somewhere between Jax and PyTorch. Like PyTorch,
| the parameters are stored as part of the model. Unlike
| traditional PyTorch, Flux has "functional" conventions,
| e.g. `g = gradient(loss, model)` vs. `loss.backward()`.
| Similar to Flax, the model is a tree of parameters.
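A sketch of the Flux convention described above (layer sizes and the loss are illustrative; assumes Flux.jl is installed):

```julia
# Flux: parameters live inside the model (PyTorch-like), but gradients
# are taken functionally (Jax-like) rather than via loss.backward().
using Flux

model = Dense(2 => 1)              # weights and bias stored in the layer
x = randn(Float32, 2, 8)           # batch of 8 two-feature inputs
y = randn(Float32, 1, 8)

loss(m, x, y) = sum(abs2, m(x) .- y) / length(y)   # plain MSE

grads = Flux.gradient(m -> loss(m, x, y), model)
grads[1].weight                    # gradient tree mirrors the model
```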
| staunton wrote:
| > Can I use PyTorch or JAX comfortably in Julia?
|
| No. And it doesn't seem like that will become possible any time
| soon.
| darsnack wrote:
| > Can I use PyTorch or JAX comfortably in Julia?
|
| There is https://github.com/rejuvyesh/PyCallChainRules.jl which
| makes this possible. But using some of the native Julia ML
| libraries that others have mentioned is preferable.
| NeuroCoder wrote:
| Just realized that these release notes don't mention the added
| support for public use of atomic pointer ops. Julia has had
| support for atomic operations in various forms for a while, but
| now users can use `unsafe_load`, `unsafe_store!`, `unsafe_swap!`,
| `unsafe_replace!`, and `unsafe_modify!` with an `order` argument.
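A minimal sketch of the new `order` argument (requires Julia >= 1.10; the `:monotonic` ordering and raw-malloc usage here are illustrative):

```julia
# Atomic pointer operations on raw memory (Julia 1.10+): unsafe_load
# and unsafe_store! accept a memory-ordering Symbol after the index.
p = convert(Ptr{Int}, Libc.malloc(sizeof(Int)))

unsafe_store!(p, 42, 1, :monotonic)   # atomic store at element index 1
v = unsafe_load(p, 1, :monotonic)     # atomic load

Libc.free(p)
v
```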
___________________________________________________________________
(page generated 2023-12-27 23:01 UTC)