What does HackerNews think of fauxpilot?

FauxPilot - an open-source GitHub Copilot server

Language: Python

From https://github.com/moyix/fauxpilot

> if you have two NVIDIA RTX 3080 GPUs, you should be able to run the 6B model by putting half on each GPU.

That's quite excessive, but nice that it can be run locally.

> Note that the VRAM requirements listed by setup.sh are total -- if you have multiple GPUs, you can split the model across them.

This might not work with the upcoming RTX 40x0 GPUs unless each card has enough memory on its own, since NVIDIA dropped NVLink in that series.

Edit: I'm now seeing there are smaller models too. Going to give it a try tomorrow.
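The arithmetic behind that VRAM note can be sketched roughly as follows. This is a hypothetical back-of-the-envelope helper, not part of FauxPilot; it assumes fp16 weights (2 bytes per parameter) plus ~20% overhead for activations and buffers, which is a rule of thumb rather than a measured figure:

```python
def vram_per_gpu_gb(params_billion, bytes_per_param=2, num_gpus=1, overhead=1.2):
    """Rough per-GPU VRAM estimate for an evenly split model.

    params_billion: model size in billions of parameters
    bytes_per_param: 2 for fp16/bf16, 4 for fp32
    overhead: fudge factor for activations/buffers (assumption, not measured)
    """
    total_bytes = params_billion * 1e9 * bytes_per_param * overhead
    return total_bytes / (1024 ** 3) / num_gpus

# A 6B model in fp16 split across two GPUs:
per_gpu = vram_per_gpu_gb(6, num_gpus=2)
print(f"{per_gpu:.1f} GB per GPU")  # roughly 6.7 GB -- fits on a 10 GB RTX 3080
```

By this estimate, each half of the 6B model lands well under the 3080's 10 GB, which is consistent with the quoted advice. Note that splitting a model this way (pipeline/tensor parallelism over PCIe) does not strictly require NVLink, though NVLink would speed up the inter-GPU traffic.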

Salesforce CodeGen (particularly the 16B-multi and 16B-mono models) is pretty good already and can be used with FauxPilot [1] to get an open Copilot-like experience with local compute :) I am also very excited about the upcoming BigCode project [2], though, which may be what you're thinking of?

Disclaimer: I am naturally biased since I made FauxPilot ;)

[1] https://github.com/moyix/fauxpilot

[2] https://www.bigcode-project.org/
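For a sense of what "Copilot-like with local compute" means in practice, here is a minimal sketch of building a completion request for a locally running FauxPilot server, which presents an OpenAI-compatible completions API. The URL, port, and engine name are assumptions for illustration; check your own FauxPilot configuration:

```python
import json

# Hypothetical local endpoint -- adjust host/port/engine to your setup.
FAUXPILOT_URL = "http://localhost:5000/v1/engines/codegen/completions"

def build_completion_request(prompt, max_tokens=32, temperature=0.1):
    """Return the JSON body an OpenAI-style client would POST to the server."""
    return {
        "prompt": prompt,          # code context to complete
        "max_tokens": max_tokens,  # length of the suggested completion
        "temperature": temperature # low temperature for deterministic code
    }

body = build_completion_request("def fibonacci(n):")
print(json.dumps(body))
# To actually send it: requests.post(FAUXPILOT_URL, json=body).json()
```

Editor plugins that speak the OpenAI API can be pointed at the same endpoint, which is what makes the drop-in Copilot replacement possible.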

Actually, a project called FauxPilot came out a couple of weeks ago and seems to be gaining steam.

https://github.com/moyix/fauxpilot