What does HackerNews think of llm?

Access large language models from the command-line

Language: Python

I can justify paying $0.02 per 1k tokens and using it via the command-line llm[1] tool, though. That is a far better deal.

[1] https://github.com/simonw/llm
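
For a rough sense of what that rate means in practice, here is a back-of-the-envelope sketch; the token counts are illustrative assumptions, not figures from the thread.

```python
# Back-of-the-envelope cost estimate at the quoted $0.02 per 1k tokens.
# The token counts below are illustrative assumptions, not figures from the thread.
RATE_PER_1K_TOKENS = 0.02  # USD, treated here as a single flat rate


def call_cost_usd(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the cost of one call at a flat per-token rate."""
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1000 * RATE_PER_1K_TOKENS


if __name__ == "__main__":
    # A 500-token prompt with a 1,000-token reply: 1,500 tokens, about $0.03
    print(f"${call_cost_usd(500, 1000):.2f}")
```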

GPT-4 isn't the default: it uses gpt-3.5-turbo, because that's massively cheaper.

If you want to run against GPT-4 (and your API key has access), you can pass "-4" or "--gpt4" as an option.
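
The same model choice can also be made from Python. This is a minimal sketch, assuming the Python API documented for more recent llm releases (llm.get_model / model.prompt) and an OpenAI key already configured with "llm keys set openai"; the comment above is about the CLI flags, which this does not use.

```python
# Minimal sketch of selecting a model via llm's Python API (recent releases).
# Assumes an OpenAI API key has already been configured, e.g. with:
#   llm keys set openai
import llm

# gpt-3.5-turbo is the cheap default referred to above.
model = llm.get_model("gpt-3.5-turbo")

# Switching to GPT-4 is just a different model name
# (your API key needs GPT-4 access):
# model = llm.get_model("gpt-4")

response = model.prompt("Summarise what the llm CLI tool does in one sentence.")
print(response.text())
```

On the command line, more recent releases appear to select the model with "-m gpt-4" rather than the older "-4"/"--gpt4" shortcuts.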

CORRECTION: Sorry, I was talking about my "llm" tool (https://github.com/simonw/llm); it looks like "mods" does indeed default to GPT-4: https://github.com/charmbracelet/mods/blob/e6352fdd8487ff8fc...