Every day for the last few months I've been wishing I had a cabin in the woods with zero modern technology. A big shelf of books, my acoustic guitars and some comfy places to sleep.
I’m literally planning to do this in a few days to get some solid focus time in. The thought of the disadvantage of not having an AI model for support has already crossed my mind (no internet access in the woods).
It’s noted that this editor works offline - does that mean the AI features also run offline? Or a limited version of them?
Unfortunately, everything but the AI works offline. Though maybe that's a feature if you're planning a more mellow retreat :)
Have you considered a limited LLM that could run locally?
> planning a more mellow retreat
The objective here is to force myself to go somewhere the internet is impossible (no phone reception, and I don’t have Starlink), aiming for focused, productive output with limited distractions.
The idea came to mind after reading about John Carmack doing this for a week, diving into AI using nothing but classic textbooks and papers as reference material to work from.
EDIT: here is the HN thread on Carmack’s week-long retreat:
> Have you considered a limited LLM that could run locally?
I think there are two main issues here. LLMs are large (the name even hints at it ;) ), and the smaller ones (still multiple GB) are really, really bad.
Edit: and they use a ton of memory, either RAM if running on the CPU or VRAM if on the GPU.
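To give a rough sense of the scale involved, here is a minimal back-of-the-envelope sketch. The model sizes and quantization levels are just illustrative assumptions, and it only counts the weights themselves (KV cache and runtime overhead are ignored):

```python
# Rough estimate of weight memory for running an LLM locally.
# Only counts parameters; ignores KV cache and runtime overhead.

def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB for a model of the given size."""
    return params_billion * 1e9 * bytes_per_param / 1e9

for params in (3, 7, 13, 70):  # illustrative open-model sizes, in billions
    for label, bpp in (("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{model_memory_gb(params, bpp):.1f} GB")
```

Even aggressively quantized, a mid-sized model wants several GB of RAM or VRAM just for the weights, which is the point about smaller local models being the only realistic option on a laptop.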