Running Qwen3:30B MoE on an RTX 3070 laptop with Ollama
17 points by kekePower | 1 comment | 6/2/2025, 8:47:16 PM | blog.kekepower.com
Comments (1)
jdboyd · 20h ago
I think there is a lot of value in people documenting what it takes to make models run well for them on 8, 12, and 16 GB GPUs, and the tools they used.
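
As a concrete illustration of the kind of setup the post title describes, here is a minimal sketch of running the model with Ollama on an 8 GB RTX 3070 laptop GPU. The qwen3:30b tag, the num_ctx/num_gpu values, and the custom model name are assumptions for illustration, not settings taken from the linked post or the comment; the number of GPU-resident layers in particular would need tuning against your own VRAM.

    # Pull the MoE model from the Ollama library (tag assumed; check `ollama list` / the library page)
    ollama pull qwen3:30b

    # Modelfile capping the context window and the number of layers offloaded to the GPU,
    # so the remaining layers stay in system RAM (values are guesses, not from the post)
    cat > Modelfile <<'EOF'
    FROM qwen3:30b
    PARAMETER num_ctx 8192
    PARAMETER num_gpu 20
    EOF

    # Build a local variant with those parameters and run it
    ollama create qwen3-30b-8gb -f Modelfile
    ollama run qwen3-30b-8gb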