Did #julialang end up kinda stalling or at least plateau-ing lower than hoped?

I know it’s got its community and dedicated users and has continued development.

But without being in that space, and speculating now from a distance, it seems it might be an interesting case study in a tech/lang that just didn’t have a landing spot it could arrive at in time, as the tech world & “data science” reshuffled while Julia tried to grow … ?

Can a language ever solve the “two language” problem?

@programming

  • tschenkel@mathstodon.xyz · 2 months ago

    @maegul @astrojuanlu @programming

    BTW, I’m not necessarily talking about “production code” in Julia. The point is that I can stay at the “research code” level and don’t have to rewrite anything; I can just move from a toy problem with a few parameters to a full-size model with 100s of parameters.
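
    As a rough illustration of that workflow, here is a minimal DifferentialEquations.jl sketch. The Lotka-Volterra system, parameter values, and solver choice are placeholders for illustration, not the actual research model described here:

    ```julia
    using DifferentialEquations

    # toy two-species Lotka-Volterra system with four parameters
    # (illustrative stand-in, not the actual research model)
    function f!(du, u, p, t)
        du[1] =  p[1] * u[1] - p[2] * u[1] * u[2]
        du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]
    end

    u0    = [1.0, 1.0]            # initial state
    tspan = (0.0, 10.0)           # integration interval
    p     = [1.5, 1.0, 3.0, 1.0]  # parameter vector

    prob = ODEProblem(f!, u0, tspan, p)
    sol  = solve(prob, Tsit5())   # same call whether p has 4 entries or 100s
    ```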

    In my case I had Python code solving an ODE system, and a global optimisation of all its parameters took a week on 24 cores of a cluster. The same case runs over the lunch break on my laptop in Julia.
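
    A sketch of the kind of parameter fit meant here, reusing the toy system from above. The synthetic data, the least-squares loss, and the Nelder-Mead optimiser from Optim.jl are assumptions for illustration; the actual study used a global optimisation over all parameters:

    ```julia
    using DifferentialEquations, Optim

    # same toy right-hand side as in the sketch above
    function f!(du, u, p, t)
        du[1] =  p[1] * u[1] - p[2] * u[1] * u[2]
        du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]
    end

    true_p = [1.5, 1.0, 3.0, 1.0]
    prob   = ODEProblem(f!, [1.0, 1.0], (0.0, 10.0), true_p)
    data   = Array(solve(prob, Tsit5(), saveat = 0.1))  # synthetic "measurements"

    # loss: solve the ODE for a candidate parameter set and compare to the data
    function loss(p)
        sol = solve(remake(prob, p = p), Tsit5(), saveat = 0.1)
        return sum(abs2, Array(sol) .- data)
    end

    # local Nelder-Mead as a simple stand-in for the global optimisation
    res = optimize(loss, [1.2, 0.8, 2.8, 1.2], NelderMead())
    ```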

    For the most recent paper a PhD student of ours used 2000-ish GPU hours for a global sensitivity/uncertainty analysis. That would have been impossible in Python or Matlab: on a single GPU it would have taken 10 years in Matlab (300 years in Python), instead of 3 months*.

    *Yes, you can just throw more GPUs at the problem, but then we can start the discussion about the CO2 footprint being 40 or 600 times higher, respectively.

    Note: all our benchmarks are for solving ODE systems. For pure linear algebra there isn’t really a difference in speed between Matlab, NumPy, and Julia, because that’s all going to run on the same LA libraries.
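
    For context on that last point, a quick Julia check (the matrix sizes are arbitrary):

    ```julia
    using LinearAlgebra

    # dense matrix multiplication is handed off to the underlying BLAS library
    # (OpenBLAS by default in Julia); NumPy and Matlab call into the same kind
    # of optimised BLAS, so the host language barely matters for this operation
    A = rand(2000, 2000)
    B = rand(2000, 2000)
    @time A * B
    ```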