
io.github.Lykhoyda/ask-ollama · v0.3.0 · Other
Quality Score: 80/100
Bridge Claude with local Ollama LLMs for private AI-to-AI collaboration — no API keys, fully local

§01  Install
Claude Desktop (claude_desktop_config.json)
{
  "mcpServers": {
    "ask-ollama": {
      "command": "npx",
      "args": [
        "-y",
        "ask-ollama-mcp"
      ],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "GMCPT_TIMEOUT_MS": "300000",
        "GMCPT_LOG_LEVEL": "warn"
      }
    }
  }
}
Cursor (.cursor/mcp.json)
{
  "mcpServers": {
    "ask-ollama": {
      "command": "npx",
      "args": [
        "-y",
        "ask-ollama-mcp"
      ],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "GMCPT_TIMEOUT_MS": "300000",
        "GMCPT_LOG_LEVEL": "warn"
      }
    }
  }
}
Cline (cline_mcp_settings.json)
Command: npx -y ask-ollama-mcp
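Cline registers MCP servers with the same mcpServers JSON shape used by the Claude Desktop and Cursor configs above. A sketch of the corresponding cline_mcp_settings.json entry, assuming the same command and the env defaults documented below:

```json
{
  "mcpServers": {
    "ask-ollama": {
      "command": "npx",
      "args": ["-y", "ask-ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "GMCPT_TIMEOUT_MS": "300000",
        "GMCPT_LOG_LEVEL": "warn"
      }
    }
  }
}
```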
§02  Environment variables
OLLAMA_HOST

Ollama server address (default: http://localhost:11434)

GMCPT_TIMEOUT_MS

Timeout for Ollama execution in milliseconds (default: 300000 = 5 minutes)

GMCPT_LOG_LEVEL

Log verbosity: debug, info, warn, error (default: warn)

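A quick sanity check of these settings can be sketched in shell. The fallback expansions mirror the documented defaults; the /api/tags model-list endpoint is standard Ollama, not specific to this server:

```shell
# Resolve the endpoint the MCP server will talk to, falling back to the documented default.
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"
# Timeout default: 300000 ms = 5 minutes.
GMCPT_TIMEOUT_MS="${GMCPT_TIMEOUT_MS:-300000}"
echo "Ollama endpoint: $OLLAMA_HOST (timeout: $((GMCPT_TIMEOUT_MS / 60000)) min)"
# To confirm Ollama is up and see which local models are installed:
#   curl -s "$OLLAMA_HOST/api/tags"
```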
§03  MCP Quality Score  ·  methodology
freshness: 25
completeness: 10
installability: 25
documentation: 15
stability: 5
Total: 80/100