by __MatrixMan__ on 3/22/2024, 4:28:23 AM
The Nix community has been at it for over a decade and they've only recently managed to repeatably build the NixOS ISO. The kind of determinism you're thinking about is a lot harder than you're giving it credit for.
It's still doable I think, and I'd love to work towards it, but who is going to pay for it? Mountains of compute burned day after day might feel like waste to nerds like us, but to an investor it's opportunity.
Fixing it? Less opportunity.
A typical GitHub project uses CI that spins up a Docker image, runs some tests, and reports back what happened, on every pushed commit. Having a large number of such repositories, I feel bad about all the energy spent and CO2 produced setting up these tests (a few minutes for every commit). I do appreciate the value added, though: I can be certain that the code still works.
Now, Docker can build in steps that depend on each other, reusing previous steps, and Nix exists to make builds deterministic; the two can be combined. With such tech, one could build a deterministic CI system where the version is a date (e.g., start with an Ubuntu install from January 2020, fully updated as of July 2023, now install my stuff). Spinning up to test the next commit should then be extremely quick.
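As a sketch of that "version is a date" idea in Nix (assuming flakes are enabled; nixos-23.05 stands in for whichever nixpkgs snapshot matches the date you pick):

    {
      # Pin the whole package set to one nixpkgs snapshot; every build
      # from this flake sees exactly the same package versions.
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-23.05";

      outputs = { self, nixpkgs }:
        let
          pkgs = nixpkgs.legacyPackages.x86_64-linux;
        in {
          # CI enters this shell instead of running "apt-get update" on a
          # floating base image, so the environment can't drift over time.
          devShells.x86_64-linux.default = pkgs.mkShell {
            packages = [ pkgs.python3 pkgs.gnumake ];
          };
        };
    }

Updating the date is then an explicit, reviewable change to the pinned input rather than whatever the mirror served that morning.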
Furthermore, with determinism, one could make rerunning an unchanged test contingent on which files and system calls it uses and whether those are the same. The same test, depending on the same system and the same code, must give the same result.
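That is essentially what a Nix derivation already does: its store path is a hash of all declared inputs, so an unchanged test run is a cache hit. A minimal sketch, assuming the tests are hermetic (no network, no wall clock) and using a placeholder test command:

    # Hypothetical: package the test run itself as a derivation. The
    # output path hashes src plus all dependencies, so if neither
    # changed, the cached result is substituted and nothing reruns.
    { pkgs ? import <nixpkgs> {} }:

    pkgs.stdenv.mkDerivation {
      name = "my-project-tests";
      src = ./.;  # hashing the source tree is the "same code" check
      nativeBuildInputs = [ pkgs.python3 ];
      buildPhase = ''
        python3 -m unittest discover > test-results.txt
      '';
      installPhase = ''
        mkdir -p $out
        cp test-results.txt $out/
      '';
    }

The system-call part is the hard bit: Nix approximates it by sandboxing the build so the only reachable files are the declared inputs.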
The benefits are obvious: deterministic testing saves compute, because the same inputs give the same result, and feedback to the developer is quicker.
Shouldn't this be tackled ASAP by big companies or startups out of their own interest?
What's the hurdle I am missing?