• by mccoyb on 9/12/2024, 9:28:29 PM

    Equinox has great idioms — really pioneered the pytree perspective. Penzai is also great.
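
    A minimal sketch of that pytree idiom (the layer and sizes here are made up for illustration): an eqx.Module is itself a pytree, so JAX transformations apply to the model directly.

        import jax
        import equinox as eqx

        class Linear(eqx.Module):
            weight: jax.Array
            bias: jax.Array

            def __init__(self, in_size, out_size, key):
                wkey, bkey = jax.random.split(key)
                self.weight = jax.random.normal(wkey, (out_size, in_size))
                self.bias = jax.random.normal(bkey, (out_size,))

            def __call__(self, x):
                return self.weight @ x + self.bias

        model = Linear(2, 3, jax.random.PRNGKey(0))
        # The model is just a pytree: its leaves are the weight and bias arrays,
        # so jax.tree_util / jax.grad / jax.jit all work on it directly.
        print(jax.tree_util.tree_leaves(model))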

    JAX feels close to achieving a sort of high-level GPGPU ecosystem. It's still super fledgling, but I keep finding more little libraries that build on JAX (and that can be used compositionally with other JAX libraries because of it).

    The only problem is that heavy compositional usage leads to big traced programs, and therefore long XLA compile times.
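
    A toy way to see that effect (the function body is artificial, chosen just to inflate the traced program the way heavy library composition does): the first jitted call pays the XLA compile cost, and a bigger trace means a longer first call.

        import time
        import jax
        import jax.numpy as jnp

        @jax.jit
        def f(x):
            # This Python loop unrolls at trace time, producing a large XLA program.
            for _ in range(200):
                x = jnp.sin(x) + jnp.cos(x)
            return x

        x = jnp.ones((10,))
        t0 = time.perf_counter()
        f(x).block_until_ready()   # first call: trace + XLA compile + run
        print("first call:", time.perf_counter() - t0)
        t0 = time.perf_counter()
        f(x).block_until_ready()   # second call: cached executable, run only
        print("second call:", time.perf_counter() - t0)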

  • by atorodius on 9/12/2024, 6:00:43 PM

    Personally a big fan of Flax. The way it separates the params and the compute graph is - imo - the right (TM) way. Saying this after many years of doing ML :)
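
    For anyone who hasn't seen it, that split looks roughly like this (toy module, made-up sizes): the module describes the compute graph, while init/apply keep the params as a separate pytree you pass in explicitly.

        import jax
        import jax.numpy as jnp
        import flax.linen as nn

        class MLP(nn.Module):
            features: int

            @nn.compact
            def __call__(self, x):
                return nn.Dense(self.features)(x)

        model = MLP(features=3)    # the compute graph; no params stored inside
        x = jnp.ones((1, 2))
        params = model.init(jax.random.PRNGKey(0), x)   # params live outside
        y = model.apply(params, x)   # params passed explicitly at call time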

  • by ubj on 9/12/2024, 9:21:05 PM

    There are quite a few other libraries associated with Equinox in the JAX ecosystem:

    https://github.com/patrick-kidger/equinox?tab=readme-ov-file...

    I've enjoyed using Equinox and Diffrax for performing ODE simulations. To my knowledge the only other peer library with similar capabilities is the Julia DifferentialEquations.jl package.
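
    A minimal Diffrax sketch of what that looks like (solving dy/dt = -y; the solver choice and step size here are arbitrary):

        import jax.numpy as jnp
        import diffrax

        def vector_field(t, y, args):
            return -y   # dy/dt = -y

        term = diffrax.ODETerm(vector_field)
        solver = diffrax.Tsit5()
        sol = diffrax.diffeqsolve(term, solver,
                                  t0=0.0, t1=1.0, dt0=0.1,
                                  y0=jnp.array(1.0))
        print(sol.ys)   # solution value(s) at t1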

  • by p1esk on 9/13/2024, 3:39:44 AM

    I wish JAX had everything I need to experiment with DL models built in natively, like PyTorch. Instead there are many third-party libraries (Flax, Trax, Haiku, this one, etc.). I have no idea which one to use. This was the case when I first played with JAX 5 years ago, and it's still the case today (even worse, it seems). This makes it a non-starter for me.

  • by Lerc on 9/12/2024, 9:26:30 PM

    For me, these are the questions to answer to decide whether or not I should bother:

    Will it try to bind me to other technologies?

    Does it work out of the box on ${GPU}?

    Is it well supported?

    Will it continue to be supported?