• by ilaksh on 8/17/2023, 3:16:03 PM

    I was using Codex for a website builder service and was really freaking out about the lack of response from OpenAI on increasing the rate limit. But after the ChatGPT API came out, I came to the conclusion that ChatGPT was just as good as or better than Codex anyway, so I immediately switched to the Chat API.

    I also wanted to use OpenAI's Edit endpoint, but now that the models are faster that is less of an issue, because it's not a big deal to just rewrite most files that fit in context. Also, with something like the new function call support, you can define something like a find-and-replace function call, or a replace-between-start-and-end-markers call (see the sketch below).
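
    As a rough sketch of that idea (assuming the openai Python library's ChatCompletion API with function calling from mid-2023; the schema and helper names here are just illustrative, not anything from my actual service):

        import json
        import openai  # openai Python library, ~v0.27 ChatCompletion API

        # Hypothetical function schema: instead of rewriting the whole file in its
        # reply, the model can ask us to run a find-and-replace on it.
        FIND_REPLACE_FN = {
            "name": "find_and_replace",
            "description": "Replace every occurrence of `find` with `replace` in the given file.",
            "parameters": {
                "type": "object",
                "properties": {
                    "path": {"type": "string", "description": "File to edit"},
                    "find": {"type": "string", "description": "Exact text to search for"},
                    "replace": {"type": "string", "description": "Text to substitute"},
                },
                "required": ["path", "find", "replace"],
            },
        }

        def request_edit(instruction, file_text):
            """Ask the model for an edit; return parsed args if it calls the function."""
            response = openai.ChatCompletion.create(
                model="gpt-3.5-turbo-0613",  # first chat model with function calling
                messages=[
                    {"role": "system", "content": "You edit source files by calling find_and_replace."},
                    {"role": "user", "content": instruction + "\n\nFile contents:\n" + file_text},
                ],
                functions=[FIND_REPLACE_FN],
            )
            message = response["choices"][0]["message"]
            if message.get("function_call"):
                # `arguments` comes back as a JSON string
                return json.loads(message["function_call"]["arguments"])
            return None  # model answered in plain text instead

    If the model comes back with a function_call, you apply the edit locally; otherwise you fall back to taking the full rewrite from the completion.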

    I think the biggest issue people are going to have is that within a few months OpenAI will release fine-tuning for the ChatGPT models, and that will probably work significantly better than the "in-context training" (i.e., adding relevant help text to the prompt for each user query) that people are using now, at least for some use cases. So there will be a lot of projects that have just finished getting vector search for prompt enhancement working, and then that approach will immediately be kind of obsolete. Although fine-tuning will probably be expensive.
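
    For anyone unfamiliar, the vector-search approach is basically this (a minimal sketch assuming OpenAI's Embedding endpoint and the ada-002 model; the embed/build_prompt helpers are just made up for the example):

        import numpy as np
        import openai

        EMBED_MODEL = "text-embedding-ada-002"

        def embed(texts):
            """Embed a list of strings with OpenAI's embedding endpoint."""
            result = openai.Embedding.create(model=EMBED_MODEL, input=texts)
            return np.array([item["embedding"] for item in result["data"]])

        def build_prompt(query, help_docs, doc_vectors, top_k=3):
            """Prepend the most relevant help snippets to the user's query."""
            query_vec = embed([query])[0]
            # ada-002 embeddings are unit-length, so a dot product is cosine similarity
            scores = doc_vectors @ query_vec
            best = np.argsort(scores)[::-1][:top_k]
            context = "\n\n".join(help_docs[i] for i in best)
            return "Use the following documentation to answer.\n\n" + context + "\n\nQuestion: " + query

    Fine-tuning would bake that help content into the weights instead of stuffing it into every prompt, which is why it could make a lot of this retrieval plumbing unnecessary for some use cases.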

    But overall I am pleased with the rate of progress and updates from OpenAI.

    I am still hopeful that within not too many more months we will finally have really strong code generation from open models. There are definitely some open models with better reasoning coming out lately, but they're not quite there yet.