• by anon389r58r58 on 8/30/2024, 4:28:03 PM

    Almost feels like a paradox of Julia at this point: on the one hand, Julia really needs a stable, high-performance AD-engine, but on the other hand it seems to be fairly easy to get a minimal AD-package off the ground.

    And so the perennial cycle continues: another Julia AD-package emerges and ignores all or most previous work in order to claim novelty.

    Without claiming a complete list: ReverseDiff.jl, ForwardDiff.jl, Zygote.jl, Enzyme.jl, Tangent.jl, Diffractor.jl, and many more whose names have disappeared in the short history of Julia...

  • by xyproto on 8/30/2024, 12:49:38 PM

    Why did Julia select a package naming convention that makes every project name look like a filename?

  • by xiaodai on 8/30/2024, 11:57:57 AM

    I kinda gave up on Julia for deep learning since it’s so buggy. I am using PyTorch now. Not great but at least it works!

  • by thetwentyone on 8/30/2024, 9:49:08 PM

    Odd that the author excluded ForwardDiff.jl and Zygote.jl, both of which get a lot of mileage in the Julia AD world. Nonetheless, awesome tutorial and great to see more Julia content like this!

  • by fithisux on 9/1/2024, 5:48:16 AM

    Another testament to the awesomeness of Julia

  • by huqedato on 8/30/2024, 10:34:22 AM

    Julia is a splendid, high-performance language, and the most overlooked. Such a huge pity and shame that the entire current AI ecosystem is built on Python/PyTorch. Python - not a real programming language, and interpreted at that... such a huge loss of performance compared to Julia.