What does HackerNews think of cog?

Containers for machine learning

Language: Go

#33 in Deep learning
#61 in Docker
#15 in Tensorflow
The Cog template is just starter code to make it super simple to deploy Llama 2 on any infrastructure of your choosing!

More about cog: https://github.com/replicate/cog

Our thinking was just that a bunch of folks will want to fine-tune right away and then deploy the fine-tunes, so we're trying to make that easy... or even just deploy the models as-is on their own infra without dealing with CUDA insanity!
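
To make that concrete: a Cog model is a cog.yaml (declaring the Python version, packages, and whether a GPU is needed, plus a predict: line pointing at a predictor class) and a predict.py that Cog wraps in a container. A rough sketch of such a predictor for a Llama-style model, where the ./weights path and generation defaults are placeholders rather than the actual template:

    # predict.py: rough sketch of a Cog predictor for a Llama-style model.
    # The ./weights path and generation defaults are placeholders, not the
    # actual replicate template.
    from cog import BasePredictor, Input
    from transformers import AutoModelForCausalLM, AutoTokenizer

    class Predictor(BasePredictor):
        def setup(self):
            # Runs once at container start: load the weights onto the GPU.
            self.tokenizer = AutoTokenizer.from_pretrained("./weights")
            self.model = AutoModelForCausalLM.from_pretrained("./weights").to("cuda")

        def predict(
            self,
            prompt: str = Input(description="Text prompt"),
            max_new_tokens: int = Input(description="Max tokens to generate", default=128),
        ) -> str:
            inputs = self.tokenizer(prompt, return_tensors="pt").to("cuda")
            out = self.model.generate(**inputs, max_new_tokens=max_new_tokens)
            return self.tokenizer.decode(out[0], skip_special_tokens=True)

Running cog build then bakes this, the declared dependencies, and an appropriate CUDA base image into a Docker image you can run on your own infrastructure.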

They have a Docker image; I would start with that. It's pretty easy to set up a container in the cloud these days. They even have a tool to customize the container: https://github.com/replicate/cog
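
Once an image is built, a Cog container exposes an HTTP prediction API, so "setting it up in the cloud" mostly means running the image and POSTing inputs to it. A minimal sketch, assuming the container is running locally with port 5000 published and the model takes a "prompt" input:

    # Sketch: call a locally running Cog container's HTTP prediction API.
    # Assumes the image was built with `cog build` and started with port 5000
    # published, and that the model defines a "prompt" input.
    import requests

    resp = requests.post(
        "http://localhost:5000/predictions",
        json={"input": {"prompt": "Hello, world"}},
    )
    resp.raise_for_status()
    print(resp.json()["output"])
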
As mentioned in sibling comments, Torch is indeed the glue in this implementation. Other glues are TVM[0] and ONNX[1].

These only cover the neural net itself, though; there is lots of surrounding code and pre-/post-processing that these systems don't handle.
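
For example, exporting a PyTorch model to ONNX captures just the network's forward graph; tokenization, resizing, decoding, and other pre-/post-processing have to be shipped some other way. A toy illustration (the model here is a stand-in):

    # Sketch: torch.onnx.export captures only the network's forward graph.
    # The toy model below stands in for a real one; tokenization, resizing,
    # decoding, etc. are not part of what gets exported.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4)).eval()
    example_input = torch.randn(1, 16)
    torch.onnx.export(model, example_input, "model.onnx")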

For models on Replicate, we package all of this up with Docker, using Cog.[2] Unfortunately, Docker doesn't run natively on Mac, so if we want to use the Mac's GPU, we can't use Docker.
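
The underlying issue is that Docker on macOS runs Linux containers inside a VM, while the Apple GPU is only reachable through PyTorch's MPS backend when running natively. A small device-selection sketch showing the split:

    # Sketch: PyTorch device selection. The "mps" branch (Apple GPU) is only
    # available when running natively on macOS, not from inside a Linux container.
    import torch

    if torch.cuda.is_available():
        device = torch.device("cuda")   # NVIDIA GPU, e.g. inside a Linux container
    elif torch.backends.mps.is_available():
        device = torch.device("mps")    # Apple-silicon GPU, native macOS only
    else:
        device = torch.device("cpu")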

I wish there was a good container system for Mac. Even better if it were something that spanned both Mac and Linux. (Not as far-fetched as it seems... I used to work at Docker and spent a bit of time looking into this...)

[0] https://tvm.apache.org/
[1] https://onnx.ai/
[2] https://github.com/replicate/cog