Julia and JuliaHub: Advancing Innovation and Growth

(info.juliahub.com)

113 points | by xgdgsc 2 days ago

12 comments

  • jarbus 5 hours ago
    I've used, and am still using, Julia for my PhD research. It's perfect for parallel/distributed computing, and the neural network primitives are more than enough for my purposes. Anything I write in pure Julia runs really, really fast, and there are great profiling tools to push performance further.
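
    A minimal sketch of the sort of thing that parallelizes with almost no ceremony (assuming Julia is launched with multiple threads, e.g. julia -t auto; the function below is just an illustration, not code from my research):

        using Base.Threads

        # Illustrative parallel reduction: one task per chunk, combined at the end.
        function parallel_sumsq(xs)
            chunks = Iterators.partition(xs, cld(length(xs), nthreads()))
            tasks = map(chunks) do chunk
                Threads.@spawn sum(abs2, chunk)
            end
            return sum(fetch, tasks)
        end

        parallel_sumsq(rand(10^7))   # uses however many threads Julia was started with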

    Julia also integrates with Python, via things like PythonCall.jl. I've gotten everything to work so far, but it hasn't been smooth. The Python code is always the major bottleneck though, so I try to avoid it.
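
    The integration is roughly this shape (a minimal sketch; assumes PythonCall.jl is installed and can find a Python environment with numpy):

        using PythonCall

        np = pyimport("numpy")                    # load a Python module from Julia
        py_x = np.linspace(0, 1, 5)               # call into Python; result stays a Python object
        jl_x = pyconvert(Vector{Float64}, py_x)   # explicit conversion to a Julia Vector

        sum(jl_x)                                 # from here on it's plain Julia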

    Overall, Julia is a significantly better language in every single aspect except the ecosystem and the occasional environment issue, which you'll often get with conda anyway. It's really a shame that practically nobody actually cares about it compared to Python. It supports multi-dimensional arrays as a first-class citizen, which means that each package doesn't have its own array type like torch, numpy, etc., and you don't have to constantly convert between the types.
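
    For example, one generic definition covers plain arrays, views, ranges, and (with the right package loaded) GPU arrays, because they are all AbstractArrays. A minimal sketch, with rmsnorm just a made-up illustration:

        # One method, no package-specific array types, no conversions.
        rmsnorm(x::AbstractArray) = x ./ sqrt(sum(abs2, x) / length(x) + 1e-8)

        rmsnorm(rand(3, 4))            # plain Array
        rmsnorm(view(rand(10), 1:5))   # SubArray (a view)
        rmsnorm(1:10)                  # even a range
        # with CUDA.jl loaded, the same definition also runs on a CuArray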

    • frakt0x90 4 hours ago
      I agree on all points. I have used Python for 15 years and Julia for 3, and I reach for Julia most of the time for personal projects. I was really stoked when, at work, the only FOSS solver for our problem turned out to be in Julia, so we wrote the rest in Julia for easy integration. The only thing I dread is having to look for a new package, since the ecosystem can be quite fragmented.
    • SirHumphrey 2 hours ago
      I actually find that the ecosystem is not that big of an issue for me in Julia. My specific use case (data analysis, numerical simulations) is probably the most developed part of the ecosystem, but even so, I find the ecosystem much more homogeneous than, for example, Python's - most things work with most other things (e.g. units or measurement-uncertainty libraries work automatically with a plotting library).
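
      A minimal sketch of what I mean, assuming Measurements.jl and Plots.jl (Unitful.jl quantities compose in the same way), with no glue code on my side:

          using Measurements, Plots

          x = 1:10
          y = [(2.0 ± 0.3) * xi + (1.0 ± 0.5) for xi in x]   # values carry their uncertainties

          plot(x, y)   # the package-provided plot recipe turns the uncertainties into error bars
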
    • Bostonian 4 hours ago
      Your last sentence applies equally to Fortran. How would you compare Julia and Fortran?
      • realo 3 hours ago
        Julia uses LLVM for its jit architecture, if I recall correctly.

        That makes it a good candidate for running well on ARM platforms (think embedded data processing at the edge).

        Not sure how well fortran does on ARM.

        • pjmlp 2 hours ago
          Fortran has done quite well on almost every major CPU since the 1950s, and on GPUs as well.

          Actually, one of the reasons CUDA won the hearts of researchers over OpenCL is that Khronos never cared about Fortran, and even C++ was late to the party.

          I attended one Khronos webinar where the panel was puzzled by a question from the audience regarding the Fortran support roadmap.

          NVIDIA is sponsoring the work on the LLVM Fortran frontend, so the same applies there.

          https://flang.llvm.org/docs/

      • eigenspace 2 hours ago
        Julia is dynamically typed and has a very rich type system, plus powerful metaprogramming and polymorphism tools.

        Julia also has an active, thriving ecosystem and an excellent package manager.
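
        A tiny sketch of the flavor (hypothetical types, not from any real package):

            abstract type Shape end
            struct Circle{T<:Real} <: Shape; r::T; end
            struct Square{T<:Real} <: Shape; side::T; end

            # Multiple dispatch: the method is chosen from the types of all arguments.
            area(c::Circle) = π * c.r^2
            area(s::Square) = s.side^2

            # Metaprogramming: @show is an ordinary macro that rewrites code before it runs.
            @show area(Circle(2.0)) + area(Square(3))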

      • SatvikBeri 3 hours ago
        Julia is generally higher level than Fortran, with syntax inspired by Python/R/Matlab. We've been able to reliably hire Math PhDs and quickly get them productive in Julia, which would take much longer with Fortran.
  • cbruns 2 hours ago
    I am a MATLAB and Python user who has flirted with Julia as a replacement. I don't love the business model of JuliaHub, which feels very similar to MathWorks in that all the cool toolboxes are gated behind a 'contact sales' or a high-priced license. The free 20 hours of cloud usage is a non-starter. Also, it seems that all JuliaHub usage is cloud-based by default? On-prem and air-gapped deployment (something I need) is implied to be $$$.

    Open sourcing and maintaining some components of things like JuliaSim or JuliaSim Control might expand adoption of Julia for people like me. I will never be able to convince my company to pay for JuliaHub if their pricing is similar to MathWorks'.

  • tmvphil 4 hours ago
    As someone working with it day to day, coming from around 18 years of mostly Python, I wish I could say my experience has been great. I find myself constantly battling the JIT, compilation and recompilation, and waiting around all the time (sometimes 10 to 15 minutes for some large projects). Widespread macro usage makes stack traces much harder to read. The lack of formal interfaces means a lot of static checking is not practical. Pkg.jl is also not great; version compatibility feels tacked on and has odd behavior.
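
    To make the interface point concrete, this is roughly the failure mode (a minimal sketch with a made-up type):

        # AbstractArray's "interface" is documented convention, not a checked contract.
        struct MyVec <: AbstractVector{Int}
            n::Int
        end
        Base.size(v::MyVec) = (v.n,)
        # Base.getindex is required by convention but "forgotten" here...

        v = MyVec(3)   # ...yet defining and constructing this raises no complaint
        sum(v)         # the missing method only surfaces as an error at runtime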

    Obviously there are real bright spots too (speed, multiple dispatch, a relatively flourishing ecosystem), but overall I wouldn't pick it up for something new if given the choice. I'd use JAX or C++ extensions for performance and settle on Python for the high-level code, despite its obvious warts.

    • catgary 1 hour ago
      Yeah, JAX with Equinox, jaxtyping, and leaning hard on Python's static typing modules + typeguard lets you pretend that you have a nice little language embedded in Python. I swore off Julia a few years ago.
  • Kalanos 2 hours ago
    With some serious repositioning, I think there is still an opportunity for Julia to displace Python tools like polars/pandas/numpy, airflow, and pytorch -- with a unified ecosystem that makes it easy to transition to the GPU and lead a differentiable programming revolution. They have the brainpower to do it.
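
    For what it's worth, the GPU story already looks roughly like this (a minimal sketch, assuming CUDA.jl and an NVIDIA card; the CPU line needs neither):

        using CUDA

        loss(x) = sum(abs2, x .- 0.5)     # ordinary, array-type-agnostic Julia

        loss(rand(Float32, 10^6))         # CPU: plain Array
        loss(CUDA.rand(Float32, 10^6))    # GPU: same function on a CuArray; broadcast and reduce run on the device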

    The future of Python's main open source data science ecosystem, NumFOCUS, does not seem bright. Despite performance improvements, Python will always be a glue language. Python succeeds because the language and its tools are *EASY TO USE*. It has nothing to do with computer science sophistication or academic prowess - it humbly gets the job done and responds to feedback.

    Compared to Mojo/MAX/Modular, the Julia community doesn't seem concerned with capturing share from Python or picking off its use cases. That's the real problem. There is room for more than one winner here. However, have the people who wanted to give Julia a shot already done so? I hope not, because there is so much richness to their community under the hood.

    • catgary 1 hour ago
      Julia has really lost the differentiable programming mindshare to JAX. I've spent weeks or months getting tricky gradients to work in Julia, only to have everything "just work" in JAX. The quality of the autograd is night and day, and it comes down to the basic design decisions of the respective "languages" (in the sense that JAX JIT-compiles a subset of Python) and their intermediate representations.

      Fundamentally, when you keep a tight, purely functional core representation of your language (e.g. jaxprs) and decompose your autograd into two steps (forward mode and a compiler-level transpose operation), you get a system where it is substantially easier to guarantee correct gradients, that is much more composable, and that even makes it easier to define custom gradients.

      Unfortunately, Julia didn't actually have any proper PLT or compiler people involved at the outset. This is the original sin I see as someone with an interest in autograd. I'm sure someone more focused on type theory has a more cogent criticism of their design decisions in that domain and would identify a different "original sin".

      In the end, I think they've made a nice MATLAB alternative, but there's a hard upper bound on what they can reach.

      • affinepplan 8 minutes ago
        > Julia didn't actually have any proper PLT or compiler people involved at the outset.

        While I don't disagree that JAX currently outshines Julia's autodiff options in many ways, I think comments like this are (1) false, (2) rude, and (3) unnecessary to make your point.

    • tomnicholas1 42 minutes ago
      > The future of Python's main open source data science ecosystem, NumFOCUS, does not seem bright. Despite performance improvements, Python will always be a glue language.

      Your first sentence is a scorching hot take, but I don't see how it's justified by your second sentence.

      The community has always understood that Python is a glue language, which is why the bottleneck interfaces (with I/O or between array types) are implemented in lower-level languages or shared ABIs. The former was originally C but is now often Rust, and Apache Arrow is a great example of the latter.

      The strength of Python is that when you want to do anything beyond pure computation (e.g. networking), the rest of the world has already built a package for it.

  • jakobnissen 7 hours ago
    It would be much more useful to see metrics that aren't cumulative if we're interested in growth. Cumulative measurements, by definition, will never decrease, even if Julia were to fall in popularity.
    • tpoacher 6 hours ago
      indeed; something like an h5-index would be interesting to see.
  • joshlk 6 hours ago
    According to Stack Overflow trends, Julia's popularity is small and decreasing:

    https://trends.stackoverflow.co/?tags=julia

    • amval 5 hours ago
      That's mostly because Julia questions get answered on its Discourse or Slack. The sharp decline is due to an automatic cross-post bot that stopped working.

      No one has bothered to fix it, in large part because Discourse is the main place of discussion, as far as I know.

    • NeutralForest 5 hours ago
      Even languages like Python and JavaScript, which are huge, show a decline after 2022, which suggests ChatGPT is probably responsible. It would be better to have some other measure, imo.
      • joshlk 5 hours ago
        It measures the proportion of questions for that language out of questions for all languages. So if there is a general decline in Stack Overflow questions, that's already accounted for in the metric.
        • NeutralForest 5 hours ago
          There are too many confounding factors still.
    • eigenspace 2 hours ago
      Julia users don't go to Stack Overflow because we have better options.
    • mjgant 5 hours ago
      Or that's the LLM/ChatGPT effect. You can see similar downtrends with other languages.
  • pjmlp 8 hours ago
    I love to see Julia grow, if nothing else as another Dylan-like take on Lisp ideas with a JIT compiler in the box, and for the community keeping up the effort to overcome tooling issues despite the critics.
    • tajd 6 hours ago
      Yeah, it's interesting to see how it's getting on! I wrote my PhD simulation code in it from the ground up, as it had nice fundamental abstractions for parallelizable code. Of course, now it's just Python and Scala/Java for me, but Julia was great for my purpose.
  • NeutralForest 7 hours ago
    I like the language, but I can't help feeling it missed the train and that the ergonomic improvements it offers are too small to justify switching over from Python.
    • dv_dt 4 hours ago
      It does feel like Julia will not make the leap to displace Python, but for a long time Python offered too few improvements over Perl, so it's not completely out of the question.
    • pjmlp 7 hours ago
      Depends on which train the Julia folks want to board.
      • NeutralForest 6 hours ago
        It felt to me like they wanted to be the language for ML/DL, which they haven't achieved. They have clearly been working more towards scientific computing + ML; all the differential equations and math packages are a testament to that (as well as the pharma stuff with Puma).

        I'm not aware of what the vision is currently tbh

        • mbauman 5 hours ago
          The key for me — as someone who has been around for a long time and is at JuliaHub — is that Julia excels most at problems that don't already have an efficient library implementation.

          If your work is well-served by existing libraries, great! There's no need to compete against something that's already working well. But that's frequently not the case for modeling, simulation, differential equations, and SciML.

          • catgary 1 hour ago
            The ODE stuff in Julia is nice, but I think diffusers/JAX is a reasonable backbone to copy over whatever you need from there. I do think Julia is doing well in stats and has gotten some mindshare from R in that regard.

            But I think a reasonably competent Python/JAX programmer can roll out whatever they need relatively easily (especially if they want to use the GPU). I do miss Tullio, though.

            • lagrange77 41 minutes ago
              > But I think a reasonably competent Python/JAX programmer can roll out whatever they need relatively easily

              You mean in terms of the ODE stuff that Julia provides?

              • catgary 28 minutes ago
                Diffusers is pretty well done (I think the author was basically rewriting some Julia libraries and adapting them to JAX). I can’t imagine it being too hard to adapt most SciML ODE solvers.

                For simulations, JAX will choke on very "branchy" computations. But honestly, I've had very little success differentiating through those computations in the first place, and they don't run well on the GPU. Thus, I'm generally inclined to use wrappers around C++ (or ideally Rust) for those purposes (my use case is usually some rigid-body-dynamics-style simulation).

        • affinepplan 6 hours ago
          I think one really good use case is complex simulations.
  • 6gvONxR4sf7o 3 hours ago
    I do scientific computing and a Lisp was one of my first languages, so I feel like I ought to be the target audience, but it just never quite hooks me.

    It's almost statically compilable, which has almost gotten me to pick it up a few times, but apparently static compilation still can't handle a lot of the most important ecosystem packages.

    The metaprogramming has also almost gotten me to pick it up a few times, but apparently there aren't mature static anti-footgun tools, even to the degree of mypy's pseudo-static analysis, so I wouldn't really want to use those features in prod or even in complex toy projects.

    It’s so damned interesting though. I hope it gets some of this eventually.

  • toolslive 2 hours ago
    We do statistical modeling in Python in our company. When a statistician asked for R, I said "no, but you can have Julia". He's quite happy with it, and we're planning to move some stuff over.
  • kayson 3 hours ago
    I'm curious how people feel about the JIT compilation time vs runtime tradeoff these days. Any good recent benchmarks?