io.github.LumabyteCo/clarifyprompt · v1.1.3 · Other

Quality Score: 84/100
AI prompt optimization for 58+ platforms across 7 categories with custom platforms

§01  Install
Claude Desktop (claude_desktop_config.json)
{
  "mcpServers": {
    "clarifyprompt": {
      "command": "npx",
      "args": [
        "-y",
        "clarifyprompt-mcp"
      ],
      "env": {
        "LLM_API_URL": "<llm_api_url>",
        "LLM_API_KEY": "<llm_api_key>",
        "LLM_MODEL": "<llm_model>"
      }
    }
  }
}
Cursor (.cursor/mcp.json)
{
  "mcpServers": {
    "clarifyprompt": {
      "command": "npx",
      "args": [
        "-y",
        "clarifyprompt-mcp"
      ],
      "env": {
        "LLM_API_URL": "<llm_api_url>",
        "LLM_API_KEY": "<llm_api_key>",
        "LLM_MODEL": "<llm_model>"
      }
    }
  }
}
Cline (cline_mcp_settings.json)
npx -y clarifyprompt-mcp
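Cline registers MCP servers in cline_mcp_settings.json rather than via a bare command. A sketch of the equivalent entry, mirroring the mcpServers shape used for Claude Desktop and Cursor above (the exact file layout may vary by Cline version):

```json
{
  "mcpServers": {
    "clarifyprompt": {
      "command": "npx",
      "args": ["-y", "clarifyprompt-mcp"],
      "env": {
        "LLM_API_URL": "<llm_api_url>",
        "LLM_API_KEY": "<llm_api_key>",
        "LLM_MODEL": "<llm_model>"
      }
    }
  }
}
```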
§02  Environment variables

LLM_API_URL (required)
  LLM API endpoint URL (OpenAI-compatible or Anthropic)

LLM_API_KEY (secret)
  API key for the LLM provider (not needed for local Ollama)

LLM_MODEL (required)
  Model name/ID to use for optimization
§03  MCP Quality Score · methodology

freshness        24
completeness     10
installability   25
documentation    15
stability        10
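The headline score of 84/100 appears to be the sum of the category scores above; a quick check (category names and point values copied from the breakdown):

```python
# Category scores from the quality-score breakdown above.
scores = {
    "freshness": 24,
    "completeness": 10,
    "installability": 25,
    "documentation": 15,
    "stability": 10,
}

# The headline Quality Score matches the sum of the categories.
total = sum(scores.values())
print(total)  # 84
```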