I’ve recently written about how to run Facebook’s language model on PantherX and Guix. If you are interested, check it out on my blog: Run LLMs like ChatGPT on your laptop's CPU
The gist:
- Run a model with 7 or 13 billion parameters
- For the 13B model you’ll need at least 16GB of memory (8GB may do for 7B)
- There’s no need for a powerful GPU, but a fast CPU is recommended
On my aging i7, it easily takes a minute to generate a longer sentence.
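For a rough idea of what this looks like in practice, here is a minimal sketch using the llama-cpp-python bindings, one common way of running these models on a CPU. The model path and settings are placeholders, not necessarily what I used:

```python
# Minimal sketch: running a quantized 7B model on the CPU via llama-cpp-python.
# The model path is a placeholder; any quantized 7B or 13B model file will do,
# provided you have enough RAM (roughly 8GB for 7B, 16GB for 13B).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/7B/model.gguf",  # hypothetical path
    n_ctx=512,     # context window
    n_threads=4,   # use as many CPU threads as your machine has to spare
)

result = llm(
    "Explain in one sentence what Guix is.",
    max_tokens=64,
    temperature=0.7,
)
print(result["choices"][0]["text"])
```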
The really cool thing is that you can teach the LLM how to use your computer. I’ve come up with some simple examples here. Keep in mind that this is not really “live” yet, so it will spit out the relevant commands, but won’t execute them.
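To illustrate the idea, here is a hedged sketch of such an example; the prompt, the instruction/command pairs, and the model path are made up for illustration and are not the exact examples from the post:

```python
# Sketch of "teaching the LLM to use your computer": the model sees a few
# instruction -> shell command pairs, then translates a new instruction.
# The resulting command is only printed, never executed.
from llama_cpp import Llama

llm = Llama(model_path="./models/7B/model.gguf", n_ctx=512)  # hypothetical path

prompt = """Translate the instruction into a single shell command.

Instruction: show free disk space
Command: df -h

Instruction: list files in the current directory
Command: ls -la

Instruction: update the Guix package definitions
Command:"""

result = llm(prompt, max_tokens=32, stop=["\n"])
command = result["choices"][0]["text"].strip()

# Not "live" yet: just show the suggested command instead of running it.
print(f"Suggested command: {command}")
```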
I suppose that in the long run, GUIs like LXQt on PantherX may be replaced by a simple language prompt. With the help of Guix’s excellent system configuration, it should be easy to apply AI to reconfigure the system in a predictable manner.
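Purely speculative, but one way this could stay predictable: have the model draft a declarative configuration, then validate it with `guix system build` (which builds the system without activating it) before a human runs `guix system reconfigure`. A rough sketch, with hypothetical paths and no real model wired in:

```python
# Speculative sketch: validate a model-generated Guix system configuration
# before anyone applies it. Nothing here changes the running system.
import subprocess
from pathlib import Path

def validate_config(config_text: str, path: str = "/tmp/candidate-config.scm") -> bool:
    """Write a candidate config and let Guix build it without switching to it."""
    Path(path).write_text(config_text)
    result = subprocess.run(["guix", "system", "build", path])
    return result.returncode == 0

candidate = "..."  # configuration text produced by the language model
if validate_config(candidate):
    print("Builds fine; review it, then run: guix system reconfigure /tmp/candidate-config.scm")
else:
    print("Configuration failed to build; nothing was changed.")
```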