What does HackerNews think of localGPT?

Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.

Language: Python

I think local LLMs are great for tinkerers, and with quantization they can run on most modern PCs. I am not comfortable sending my personal data over to OpenAI/Anthropic, so I've been playing around with https://github.com/PromtEngineer/localGPT/, GPT4All, etc., which keep the data all local.

Sliding window chunking, RAG, etc. seem more sophisticated than the other document LLM tools, so I would love to try this out if you ever add the ability to run LLMs locally!
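The sliding-window chunking mentioned above can be sketched in a few lines. This is a minimal illustration of the general technique, not localGPT's actual implementation; the function and parameter names here are my own:

```python
def chunk_text(text, window=200, overlap=50):
    """Split text into overlapping chunks of `window` characters.

    Consecutive chunks share `overlap` characters, so a sentence cut
    at one chunk boundary is still intact in the neighboring chunk --
    which is the point of sliding-window chunking for retrieval.
    """
    step = window - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + window])
        if start + window >= len(text):
            break
    return chunks
```

Each chunk would then be embedded and stored in a vector index; at query time the closest chunks are fed to the LLM as context (the retrieval-augmented-generation loop).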

I've used https://github.com/PromtEngineer/localGPT for this and thought it was nice. So I packaged it in a docker container for easy use.

docker run -itd --gpus all -p ${PORT}:5111 --name llm-local-wizardlm-7b obald/llm-launcher:0.0.2

Just open localhost:&lt;port&gt; in the browser, upload docs, then ask questions in the GUI.

Really nice for easy lookup of rules in board games and such, as it provides the relevant text from the docs in addition to the answer to the query.
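Returning the matched source text alongside the answer comes straight from the retrieval step: the chunks that scored highest against the query are exactly what gets shown. A toy version of that ranking, using plain word overlap instead of real embeddings (names are hypothetical, for illustration only):

```python
def retrieve(query, chunks, top_k=1):
    """Rank document chunks by word overlap with the query.

    Real tools (localGPT included) score with vector embeddings, but
    the structure is the same: the top-scoring chunks become both the
    LLM's context and the "source text" surfaced to the user.
    """
    query_words = set(query.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(query_words & set(c.lower().split())),
        reverse=True,
    )
    return scored[:top_k]
```

Surfacing these chunks lets you verify the answer against the rulebook instead of trusting the model blindly.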

https://gitlab.com/PeterHedman/llm-local