HackerNews
czyhandsome
Do you support custom LLMs?
jackmpcollins
At the moment only those that support the OpenAI Chat API, with function calling for the structured outputs. For example, you can use LocalAI [0][1] to run models locally.
[0] https://github.com/go-skynet/LocalAI
[1] https://localai.io/features/openai-functions/
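As a sketch of what "supports the OpenAI Chat API" means in practice: any server exposing the same `/v1/chat/completions` endpoint can be targeted just by swapping the base URL. The payload shape below follows the OpenAI Chat API; the local URL, model name, and `return_answer` function are assumptions for illustration, not part of any specific library.

```python
import json
import urllib.request

# LocalAI's default port is commonly 8080 (an assumption here);
# the model name depends on what you have loaded locally.
BASE_URL = "http://localhost:8080/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI Chat API payload that requests a function call,
    which is how structured outputs are obtained."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "tools": [{
            "type": "function",
            "function": {
                # Hypothetical function whose arguments carry the
                # structured result back from the model.
                "name": "return_answer",
                "parameters": {
                    "type": "object",
                    "properties": {"answer": {"type": "string"}},
                    "required": ["answer"],
                },
            },
        }],
    }


def send(payload: dict) -> dict:
    """POST the payload to the (local) OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because only the base URL differs, the same request works against api.openai.com or a LocalAI instance, provided the local model was built with function-calling support.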