Only supports private / closed LLMs like OpenAI and Claude. People need to design for local LLMs first, then for for-profit providers.
Yeah, this can definitely be used for local models, but the problem is that most personal computers can't host large LLMs, and running them isn't cheaper than closed LLMs. For organisations, though, local LLMs are a better choice.
I think local LLMs are great for tinkerers, and with quantization they can run on most modern PCs. I'm not comfortable sending my personal data over to OpenAI/Anthropic, so I've been playing around with https://github.com/PromtEngineer/localGPT/, GPT4All, etc., which keep the data all local.
Sliding window chunking, RAG, etc. seem more sophisticated than the other document LLM tools, so I would love to try this out if you ever add the ability to run LLMs locally!
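For anyone unfamiliar, sliding window chunking just means splitting a document into overlapping pieces so context isn't lost at chunk boundaries. A minimal sketch in Python (the function name, window size, and overlap here are illustrative assumptions, not what this tool actually uses):

```python
def sliding_window_chunks(text, window_size=200, overlap=50):
    """Split text into overlapping chunks of roughly window_size words.

    Each chunk shares `overlap` words with the previous one, so a
    sentence that straddles a boundary still appears whole in at
    least one chunk. Purely illustrative; real tools often chunk
    by tokens or characters instead of words.
    """
    words = text.split()
    step = window_size - overlap  # how far the window advances each time
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + window_size]))
        if start + window_size >= len(words):
            break  # last window already covers the end of the text
    return chunks
```

Each chunk then gets embedded and indexed for retrieval; the overlap is what keeps answers from falling through the cracks between chunks.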