llama.cpp Gets Full MCP Support with Tools & UI

llama.cpp now includes complete Model Context Protocol support, giving developers tool use and a richer web UI for local language model workflows.

llama.cpp just picked up a substantial MCP (Model Context Protocol) implementation that turns the web UI into something far more capable.

The new stuff includes:

  • Tool calls with agentic loops - the model can actually use external tools and chain actions
  • MCP prompts - reusable prompt templates with argument forms
  • Resource browser - navigate and attach files/data with a filetree view
  • CORS proxy built-in - no more fighting with cross-origin requests
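For context on what a tool call actually looks like on the wire: MCP speaks JSON-RPC 2.0, and invoking a tool is a `tools/call` request. Here's a minimal sketch of building one — the method name and params layout follow the MCP spec, but the `get_weather` tool and its arguments are invented for illustration, not something llama.cpp ships:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls.

    "tools/call" and the params layout come from the MCP specification;
    the specific tool name and arguments passed in are up to the server.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": tool_name,
            "arguments": arguments,
        },
    }

# Hypothetical tool exposed by some MCP server:
req = make_tool_call(1, "get_weather", {"city": "Berlin"})
print(json.dumps(req, indent=2))
```

In an agentic loop, the client sends requests like this, feeds the result back into the model's context, and lets the model decide whether to chain another call.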

There’s also a server selector, capability cards showing what each MCP server can do, and a raw output toggle to see exactly what the model sent.
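Those capability cards map onto MCP's `initialize` handshake, where each server advertises which feature sets it supports. A rough sketch of the shape of an `initialize` result — field names follow the MCP spec, but the server info values here are made up:

```python
# Shape of an MCP "initialize" result: the server declares which
# protocol features it supports. serverInfo values are invented.
init_result = {
    "protocolVersion": "2025-03-26",
    "capabilities": {
        "tools": {},        # server exposes callable tools
        "prompts": {},      # server exposes prompt templates
        "resources": {},    # server exposes browsable resources
    },
    "serverInfo": {"name": "example-server", "version": "0.1.0"},
}

# A client UI can render one "capability card" per advertised feature set.
advertised = sorted(init_result["capabilities"].keys())
print(advertised)  # ['prompts', 'resources', 'tools']
```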

Fair warning - this is bleeding edge development work in PR #18655. The developer suggests only diving in if you’re comfortable with work-in-progress features.

Check it out at: https://github.com/ggml-org/llama.cpp/pull/18655

Pretty wild how this moves llama.cpp from just inference to actually orchestrating tools and resources.