Ask HN: How do you load your code base as context window in ChatGPT?

by alfonsodev on 1/7/2025, 5:37:28 PM with 2 comments
I've read that o1 has a 200k-token context window limit [1], and my code base is about 177k tokens. I could generate a single prompt from my code base with the code2prompt [2] tool (a token-count sanity check is sketched after this list) and paste it in, but:

- it doesn't allow me to attach text files when o1 is selected.

- Pasting the whole prompt into the text area freezes the browser for a while, and when it's done I can't submit it because the send button is disabled.

- When creating a project I can attach files, but then I can't select the o1 model.
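
For what it's worth, here is a minimal sketch of how one could verify that the generated prompt really fits under the 200k limit before wrestling with the UI. It uses the tiktoken library; the file name codebase_prompt.md is hypothetical, and I'm assuming o1 uses the o200k_base encoding:

    # Count tokens in the generated prompt file (pip install tiktoken).
    # codebase_prompt.md is a hypothetical output file from code2prompt.
    import tiktoken

    with open("codebase_prompt.md", "r", encoding="utf-8") as f:
        prompt = f.read()

    # o200k_base is the tokenizer used by newer OpenAI models;
    # assumed here to apply to o1 as well.
    enc = tiktoken.get_encoding("o200k_base")
    n_tokens = len(enc.encode(prompt))
    print(f"{n_tokens} tokens vs. a 200k context window")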

I'm on the fence about buying the Pro subscription; I would if I could use o1 with my code base loaded as context.

When they say 200k tokens, do they mean through the API? But then I'd incur extra costs, which seems odd since I'm already paying for the subscription, and this isn't an automation use case.
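
In case the API route turns out to be the only way, below is a minimal sketch using the official openai Python SDK; the model name "o1" and the single user message are assumptions on my part, and the API is billed per token on top of the subscription:

    # Send the whole code-base prompt to o1 via the API (pip install openai).
    # Assumes OPENAI_API_KEY is set in the environment and the prompt fits
    # within the 200k-token context window.
    from openai import OpenAI

    client = OpenAI()

    with open("codebase_prompt.md", "r", encoding="utf-8") as f:
        prompt = f.read()

    response = client.chat.completions.create(
        model="o1",  # model name as listed in [1]; adjust if needed
        messages=[
            {"role": "user", "content": prompt + "\n\nExplain the overall architecture."}
        ],
    )
    print(response.choices[0].message.content)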

I would appreciate it if anyone could share their experience working with a large context window for the specific use case of a code base.

Thanks!

- [1] https://platform.openai.com/docs/models#o1

- [2] https://github.com/mufeedvh/code2prompt

  • by cloudking on 1/7/2025, 5:48:09 PM

    For whole codebase prompting, you'll have a much better time with https://www.cursor.com/

    You can use OpenAI, Anthropic, Google, etc. as the LLM provider; o1 is supported.

    https://docs.cursor.com/chat/codebase

  • by dsrtslnd23 on 1/11/2025, 8:07:11 AM

    I use o1 with https://aider.chat