MCP Clients & Local Models

Claude Desktop, Cursor, VS Code, Tome, and Ollama — which MCP clients support what, and why local models are worth having.

March 3, 2026 · 2 min read

When I started with MCP, one of the first things I had to get straight was what an MCP client actually is — it's any application that can connect to and consume MCP servers. The client handles the connection, sends requests, and feeds tool results back to the LLM. Several tools already ship with MCP client support:

| Client | Notes |
| --- | --- |
| Claude Desktop | Full support — tools, resources, prompts. Connects to local servers via stdio. |
| Cursor | Full MCP support inside the editor. |
| VS Code (Agent Mode) | Full support; requires Agent Mode enabled. |
| Claude Code | CLI tool with MCP support. |
| Tome | Supports tools only — no resources or prompts. |

Why Claude Desktop Instead of Claude.ai

You might wonder why not just use Claude on the web. The answer is transport: MCP servers running locally on your machine can't be reached from a remote website. Claude Desktop runs as a native app, which means it can spawn local processes and communicate with them directly via stdio (standard input/output).

Claude.ai would need a remote MCP server with a public URL — fine for production, but unnecessary friction while learning.
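As a rough sketch, registering a local stdio server with Claude Desktop looks like this in its `claude_desktop_config.json` (the server name `my-server` and the script path are placeholders for your own server):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "python",
      "args": ["/path/to/server.py"]
    }
  }
}
```

Claude Desktop spawns the `command` as a child process and speaks MCP to it over stdin/stdout, which is exactly what the web app can't do.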


Ollama — Running Models Locally

Ollama is a tool for managing and running open-source AI models on your own hardware, no API key required. I reach for it constantly during experimentation — you install it once and then:

```bash
# Pull a model
ollama pull qwen3:8b

# Run it interactively
ollama run qwen3:8b

# See what you have installed
ollama list
```

Qwen 3 8B is a solid choice for MCP experimentation — it supports tool calling out of the box and runs comfortably on most modern laptops.

Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1, so any client that can talk to OpenAI can point it at a local model instead.
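For example, a minimal sketch of hitting that endpoint directly with curl (this assumes Ollama is running locally and `qwen3:8b` has already been pulled):

```shell
# Send a chat completion request to the local Ollama server,
# using the same request shape as OpenAI's chat API.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3:8b",
    "messages": [{"role": "user", "content": "Say hello in one word."}]
  }'
```

Swapping the base URL like this is all most OpenAI-compatible clients need; no API key is required for a local Ollama instance.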


Tome

Tome is a purpose-built MCP client with a clean interface for testing servers. It's useful, but it only supports tools — not resources or prompts. If you're working through material that covers all three primitives, you'll miss the resources and prompts sections with Tome.

Claude Desktop is the safer default for learning — it supports the full MCP spec.


Picking Your Setup

For local development, here's what I'd recommend:

  • Claude Desktop as your primary client (stdio transport, full support)
  • Ollama + Qwen 3 8B as your local model for experimentation
  • Cursor or VS Code Agent Mode when you want MCP inside your editor

The MCP server code you write is the same regardless of which client you use. The client is just the host that runs it.
