MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2)

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides three main tools:

  • list_models - List all downloaded Ollama models
  • show_model - Get detailed information about a specific model
  • ask_model - Ask a question to a specified model

License

MIT
