io.github.egoughnour/massive-context-mcp · v3.0.1 · Other

Quality Score: 80/100
Handles 10M+ token contexts with chunking, sub-queries, and local Ollama inference.

§01  Install
Claude Desktop (uvx)
{
  "mcpServers": {
    "massive-context-mcp": {
      "command": "uvx",
      "args": [
        "massive-context-mcp"
      ],
      "env": {
        "RLM_DATA_DIR": "<rlm_data_dir>",
        "OLLAMA_URL": "<ollama_url>"
      }
    }
  }
}
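For clarity, here is the same configuration with the two placeholders filled in with example values (the data directory path is an illustrative assumption; the Ollama URL shown is the documented default — substitute your own values as needed):

```json
{
  "mcpServers": {
    "massive-context-mcp": {
      "command": "uvx",
      "args": ["massive-context-mcp"],
      "env": {
        "RLM_DATA_DIR": "/Users/you/.rlm-data",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```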
§02  Environment variables
RLM_DATA_DIR

Directory for storing context data.

OLLAMA_URL

URL of the Ollama server (default: http://localhost:11434).

§03  MCP Quality Score  ·  methodology
freshness: 20
completeness: 10
installability: 25
documentation: 15
stability: 10

These component scores sum to the overall quality score of 80/100.