• by smoldesu on 5/3/2023, 9:02:07 PM

    A lot of it will revolve around Nvidia hardware that you either own or rent. I've built CPU-accelerated AI bots on free VPSes before, but it's slow and reflects neither best practices nor state-of-the-art inference. Right now, a lot of the meaningful "private cloud" AI stuff is built on extremely proprietary runtimes.
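
    (Not claiming this is how the parent did it, but as a rough sketch of CPU-only inference: the Hugging Face transformers pipeline can run a small model entirely on CPU. This assumes transformers and torch are installed, and uses gpt2 purely as a placeholder model.)

        # Minimal CPU-only text generation sketch (assumes `pip install transformers torch`).
        from transformers import pipeline

        # device=-1 forces CPU execution; "gpt2" is just a small placeholder model.
        generator = pipeline("text-generation", model="gpt2", device=-1)

        result = generator("Private-cloud AI today mostly runs on", max_new_tokens=40)
        print(result[0]["generated_text"])

    Even a model this small is noticeably slow on a shared free-tier VPS, which is the point being made above.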

  • by ftxbro on 5/3/2023, 8:54:23 PM

    ask GPT-4