Reasoning AI Now Runs on Phones (900MB RAM)
Turns out reasoning models can now run on phones. Liquid AI just dropped LFM2.5-1.2B-Thinking, a model that does chain-of-thought reasoning internally but needs only 900 MB of RAM.
The cool part: it actually thinks through problems step-by-step before answering, similar to what o1 does, except it runs locally on whatever device someone has sitting around.
Grab it here:
- Hugging Face: https://huggingface.co/LiquidAI/LFM2.5-1.2B-Thinking
- LEAP: https://leap.liquid.ai/models?model=lfm2.5-1.2b-thinking
- Playground: https://playground.liquid.ai/
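
If you want to kick the tires before putting it on a phone, here's a minimal sketch of loading it with Hugging Face transformers. It assumes a recent transformers release with LFM2 support, and the generation settings are illustrative guesses, not Liquid AI's official recommendations:

```python
# Minimal sketch of running LFM2.5-1.2B-Thinking locally with Hugging Face
# transformers. Assumes a recent transformers release with LFM2 support;
# generation settings here are illustrative, not official recommendations.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LiquidAI/LFM2.5-1.2B-Thinking"  # repo ID from the link above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # roughly halves memory vs. float32
    device_map="auto",           # needs the accelerate package; CPU works too
)

messages = [{"role": "user", "content": "What is 17 * 24? Think it through."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Reasoning models spend tokens on the trace, so leave generous headroom.
output = model.generate(input_ids, max_new_tokens=1024, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=False))
```
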
It beats Qwen3-1.7B on most benchmarks despite being 40% smaller, and handles math/tool use surprisingly well for something that fits on a phone. The internal reasoning traces mean it can work through problems systematically instead of just pattern-matching responses.
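
If you want to log or inspect those traces separately from the final answer, you'll need to split them out of the raw output. Here's a hedged sketch that assumes the model wraps its reasoning in `<think>...</think>` tags, a convention many open reasoning models follow; check the model card for the exact delimiters:

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a model response into (reasoning trace, final answer).

    Assumes the trace is wrapped in <think>...</think> tags, a common
    convention for open reasoning models; verify against the model card.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        return "", text.strip()  # no trace found; treat everything as answer
    return match.group(1).strip(), text[match.end():].strip()

# Hypothetical output, just to show the shape of the split:
sample = "<think>17 * 24 = 17 * 20 + 17 * 4 = 340 + 68 = 408</think>The answer is 408."
trace, answer = split_reasoning(sample)
print("TRACE:", trace)
print("ANSWER:", answer)
```
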
Pretty wild that what needed server racks two years ago now runs on anything with a spare gigabyte of memory.