Ask HN: How do I take an LLM with me when I go camping?
1 Drblessing 21 6/9/2025, 7:29:39 PM
Edit: I’ll be off-grid and offline, likely without cell service depending on how deep I go.
I would like to have an LLM with me to help out when I go camping as a solo beginner. I have an Apple M1 Pro with 16GB of RAM.
What's the best way to do this?
What's the best model?
Gemma 3, Qwen 2.5 Instruct, Qwen 2.5 Coder
You should take multiple distills/quants. It's good to have a high-quality model sometimes, but for most stuff you'll want something below 1GB for fast response times (rough sizing arithmetic is sketched after this comment). The quality is close to or better than the original ChatGPT, and they support extremely long contexts (if you have the memory). It might be good to take a VLM as well (I've been happy with Qwen's VLM, although it's slow).
Gemma is Google's distillation of their larger Gemini models (at least that's my understanding). Qwen is Alibaba's model. Qwen is usually very good at code; Gemma tends to be a little better at everything else.
There are DeepSeek distills that use either Qwen or Llama as a base. I haven't been impressed with them, though. TBH I've felt like most of the reasoning models are overhyped.
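As a very rough rule of thumb (back-of-envelope assumptions, not exact numbers for any particular release), the weights take about parameters × bits-per-weight ÷ 8 bytes, plus some overhead for the KV cache and the runtime itself:

    # Back-of-envelope RAM estimate for a quantized model.
    # The overhead figure is a guess covering KV cache, activations, and the runtime.
    def approx_model_ram_gb(params_billions: float, bits_per_weight: float,
                            overhead_gb: float = 1.5) -> float:
        weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
        return weight_gb + overhead_gb

    for name, params, bits in [
        ("~0.5B model at 8-bit", 0.5, 8),   # ~0.5 GB of weights
        ("7B model at 4-bit",    7.0, 4),   # ~3.5 GB of weights
        ("14B model at 4-bit",   14.0, 4),  # ~7 GB of weights
    ]:
        print(f"{name}: ~{approx_model_ram_gb(params, bits):.1f} GB total")

So a sub-1B model at 8-bit keeps the weights under 1GB, while a 4-bit ~14B model is about as large as you'd want on a 16GB machine once the OS and a long context take their share.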
Any tips, or fun ways you've used your local model while camping?
Others will probably have better model recommendations, I am using Mistral and Gemma myself.
As for how, I have ollama for trying out local models to see what they can do: https://github.com/ollama/ollama
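If you go that route, ollama runs a small local HTTP server (port 11434 by default) that you can script against. A minimal sketch, assuming the daemon is running and you've already pulled a model (the model name and prompt here are just placeholders):

    # Query a locally running ollama server over its HTTP API.
    # Assumes `ollama serve` is running and the model has already been pulled.
    import json
    import urllib.request

    def ask_local_llm(prompt: str, model: str = "gemma3") -> str:
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask_local_llm("How do I purify stream water without a filter?"))

Setting "stream": False returns one JSON blob instead of token-by-token chunks, which keeps the example short.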
I've not been impressed with any of the models that can fit in 16 GB (significantly fewer than 16B parameters), but this is such a fast-moving area that you should look up an online leaderboard right before your trip and try whatever it says is the new leader for your 16GB of RAM. Even a week is long enough to be out of date in this field, so the answer may well be different by the time you return.
"<thinking>
I'm assuming the grizzly bear has run away -- good for you for fighting it off. The first step is going to be to stop the bleeding. You'll want a rag or a t-shirt for this..."
My advice: bring a friend. Or hire a guide. Or read a survival book.
But I watched a guy's YouTube video about surviving in the wilderness using only tools he could 3D print in-situ. It was entertaining. So I look forward to the post mortem!
I will definitely be reading a survival book. Any recommendations?
Even if you somehow find something worth querying it about, you would have no way to check whether the answer is real or just another hallucination, especially from a deeply pruned model.
And off grid? Test how long your Mac actually runs on battery with occasional LLM runs before you go.