What does HackerNews think of gpt-repository-loader?
Convert code repos into an LLM prompt-friendly format. Mostly built by GPT-4.
Language: Python
You may be interested in this: https://github.com/mpoon/gpt-repository-loader
I've had access to the 32k model for a bit and I've been using this to collect and stuff codebases into the context: https://github.com/mpoon/gpt-repository-loader
It works really well: you can tell it to implement new features or mutate parts of the code, and having the entire codebase (or a lot of it) in its context really improves the output.
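For context, feeding the flattened repo text to the model is just an ordinary chat completion call with the dump prepended to your instruction. A minimal sketch, assuming the legacy pre-1.0 openai Python library, access to the gpt-4-32k model, and a hypothetical repo_dump.txt produced by the loader:

```python
# Sketch: send a flattened repo plus an instruction to a 32k-context model.
# Assumes the legacy openai library (pre-1.0), the gpt-4-32k model name, and
# that OPENAI_API_KEY is set in the environment. The file name and the
# instruction text are placeholders.
import openai

with open("repo_dump.txt", encoding="utf-8") as f:
    repo_text = f.read()

response = openai.ChatCompletion.create(
    model="gpt-4-32k",
    messages=[
        {"role": "system", "content": "You are a senior engineer working on this codebase."},
        {"role": "user", "content": repo_text + "\n\nImplement feature X in module Y."},
    ],
)
print(response.choices[0].message.content)
```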
The biggest caveat: shit is expensive! A full 32k-token request will run you about $2, and if you go back and forth in dialogue you can rack up quite a bill quickly. If it were 10x cheaper, I would use nothing else; having a large context window is that much of a game changer. As it stands, I _very_ carefully construct the prompt and move the conversation out of the 32k into the 8k model as fast as I can to save cost.
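Since cost scales with prompt size, it can help to count tokens before sending. A rough sketch using tiktoken, assuming the early-2023 GPT-4-32k prompt rate of $0.06 per 1K tokens (check current pricing, rates have changed since):

```python
# Rough sketch: estimate prompt size and per-request cost before sending, so a
# 32k-context request doesn't surprise you. The price constant is an assumption
# based on early-2023 GPT-4-32k prompt pricing.
import tiktoken

PROMPT_PRICE_PER_1K = 0.06      # assumed gpt-4-32k prompt rate at the time
CONTEXT_LIMIT = 32_768          # 32k context window
REPLY_HEADROOM = 1_000          # leave room for the model's reply

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by GPT-4 models

with open("repo_dump.txt", encoding="utf-8") as f:
    prompt = f.read()

n_tokens = len(enc.encode(prompt))
print(f"{n_tokens} prompt tokens, ~${n_tokens / 1000 * PROMPT_PRICE_PER_1K:.2f} per request")
if n_tokens > CONTEXT_LIMIT - REPLY_HEADROOM:
    print("Too big for the 32k context; trim files or split the prompt.")
```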
I think we’re talking about different projects. This one just gives you a text output of an entire repo.
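For reference, "a text output of an entire repo" means the repository's files concatenated into one prompt-friendly file, each preceded by its path and a delimiter. A minimal sketch of that kind of flattening; the delimiter layout and skip list here are assumptions, not necessarily gpt-repository-loader's exact format:

```python
# Sketch: flatten a repo into a single prompt-friendly text file by walking the
# tree, skipping obvious non-source directories, and writing each file behind a
# delimiter and its relative path. Delimiters and skip list are illustrative.
import os

SKIP_DIRS = {".git", "node_modules", "__pycache__"}

def dump_repo(repo_path: str, out_path: str) -> None:
    with open(out_path, "w", encoding="utf-8") as out:
        for root, dirs, files in os.walk(repo_path):
            # Prune skipped directories in place so os.walk doesn't descend into them.
            dirs[:] = [d for d in dirs if d not in SKIP_DIRS]
            for name in sorted(files):
                full = os.path.join(root, name)
                rel = os.path.relpath(full, repo_path)
                try:
                    with open(full, "r", encoding="utf-8") as f:
                        contents = f.read()
                except UnicodeDecodeError:
                    continue  # skip binary files
                out.write("----\n")
                out.write(rel + "\n")
                out.write(contents + "\n")
        out.write("--END--\n")

dump_repo(".", "repo_dump.txt")
```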