
Reasoning AI Now Runs on Phones (900MB RAM)


Turns out reasoning models can now run on phones. Liquid AI dropped LFM2.5-1.2B-Thinking - a model that does chain-of-thought internally but only needs 900 MB of RAM.
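A quick back-of-the-envelope check on that memory figure. The byte-per-parameter widths below are standard quantization sizes, not anything Liquid AI has published, and the estimate covers weights only (activations and KV cache would add more):

```python
# Rough weight-memory footprint of a 1.2B-parameter model at
# common quantization widths (weights only; KV cache excluded).
PARAMS = 1.2e9

def footprint_mb(bytes_per_param: float) -> float:
    """Weight memory in megabytes (1 MB = 1e6 bytes)."""
    return PARAMS * bytes_per_param / 1e6

for name, width in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{name}: {footprint_mb(width):,.0f} MB")
# fp16: 2,400 MB / int8: 1,200 MB / int4: 600 MB
```

The 900 MB figure lands between the int8 and int4 rows, which is consistent with sub-byte quantization plus some runtime overhead.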

The cool part: it actually thinks through problems step-by-step before answering, similar to what o1 does, except it runs locally on whatever device someone has sitting around.

Grab it here:

It beats Qwen3-1.7B on most benchmarks despite being 40% smaller, and handles math/tool use surprisingly well for something that fits on a phone. The internal reasoning traces mean it can work through problems systematically instead of just pattern-matching responses.
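Those internal traces are typically emitted inline with the answer and stripped before display. A minimal sketch of that separation, assuming the common `<think>...</think>` delimiter (the actual tag format for this model isn't confirmed here, so check the model card):

```python
import re

def split_reasoning(raw: str) -> tuple[str, str]:
    """Split a reasoning model's raw output into (trace, answer).

    Assumes the trace is wrapped in <think>...</think>; if no tags
    are found, the whole output is treated as the answer.
    """
    match = re.search(r"<think>(.*?)</think>", raw, flags=re.DOTALL)
    if not match:
        return "", raw.strip()
    trace = match.group(1).strip()
    answer = raw[match.end():].strip()
    return trace, answer

trace, answer = split_reasoning(
    "<think>17 * 3 = 51, plus 4 is 55.</think>The answer is 55."
)
print(answer)  # -> The answer is 55.
```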

Pretty wild that what needed server racks two years ago now fits in under a gig of memory on a phone.