How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools)

1 point by annjose · 6/8/2025, 6:10:08 PM · annjose.com

Comments (1)

incomingpain · 8h ago
Any model that can run on a mobile device will likely be 8B parameters or smaller, and will have very noticeable hallucination problems.
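
The "8B or smaller" rule of thumb follows from simple memory arithmetic: weight storage alone must fit comfortably in a phone's RAM, and mobile devices typically have 6–16 GB total. A minimal sketch of that arithmetic (the function name is illustrative; it counts only the weights, ignoring KV cache and runtime overhead):

```python
# Back-of-envelope memory footprint for dense LLM weights
# at different quantization bit widths.

def weight_memory_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GiB, ignoring KV cache and overhead."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30

for params in (2, 4, 8):
    for bits in (16, 8, 4):
        gib = weight_memory_gib(params, bits)
        print(f"{params}B @ {bits}-bit: ~{gib:.1f} GiB")
```

An 8B model comes to roughly 14.9 GiB at 16-bit but only about 3.7 GiB at 4-bit quantization, which is why on-device runtimes ship aggressively quantized checkpoints and why anything much larger than 8B is impractical on current phones.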