Are there any "small" LMs that can be deployed locally, understand formats and specs, and generate outputs?

Alpaca [1], perhaps. It's based on Facebook's model (LLaMA), and it's been trained in a conversational style, same as ChatGPT. I don't know whether it can produce code, though.

[1] https://github.com/antimatter15/alpaca.cpp
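To give a feel for the "deploy locally and ask for structured output" part of the question: alpaca.cpp itself is a C++ chat binary, but a similar local setup can be driven from Python via the llama-cpp-python bindings (my choice here, not something the linked repo provides). The model path and the JSON task are hypothetical; the "### Instruction / ### Response" template is the Alpaca-style prompt format. A minimal sketch, assuming you already have a quantized model file on disk:

    from llama_cpp import Llama

    # Load a locally stored, quantized model (path is an assumption;
    # substitute whatever weights file you actually have).
    llm = Llama(model_path="./models/model.gguf", n_ctx=2048, verbose=False)

    # Alpaca-style instruction prompt asking for a spec-conforming output.
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        'Return a JSON object with the fields "name" and "port" '
        "for an HTTP server config.\n\n"
        "### Response:\n"
    )

    # Generate locally; stop before the model starts a new section.
    out = llm(prompt, max_tokens=128, stop=["###"])
    print(out["choices"][0]["text"].strip())

Whether a 7B-class model reliably sticks to the requested format is another question; in my experience it helps to keep the spec short and show one example in the prompt.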