io.github.Ainode-tech/cache-proxy
Quality Score: 70/100
LLM caching proxy (x402 USDC payments on Base) with exact and semantic caching; health checks are free.
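The listing advertises a two-tier "exact + semantic" cache but does not document the proxy's internals. As a hedged illustration of the general technique, the sketch below hashes prompts for exact hits and falls back to a similarity search; the `TwoTierCache` class, the token-overlap (Jaccard) similarity standing in for embedding cosine similarity, and the threshold value are all illustrative assumptions, not this proxy's actual implementation.

```python
import hashlib


class TwoTierCache:
    """Toy sketch of exact + semantic LLM response caching.

    Real proxies typically use embedding vectors and cosine similarity;
    token-overlap similarity is used here only to keep the sketch
    self-contained.
    """

    def __init__(self, threshold: float = 0.6):
        self.exact: dict[str, str] = {}
        self.semantic: list[tuple[set, str]] = []
        self.threshold = threshold

    @staticmethod
    def _key(prompt: str) -> str:
        # Exact tier: hash of the verbatim prompt.
        return hashlib.sha256(prompt.encode()).hexdigest()

    @staticmethod
    def _similarity(a: set, b: set) -> float:
        # Jaccard overlap stands in for embedding similarity.
        return len(a & b) / len(a | b) if a | b else 0.0

    def put(self, prompt: str, response: str) -> None:
        self.exact[self._key(prompt)] = response
        self.semantic.append((set(prompt.lower().split()), response))

    def get(self, prompt: str):
        # 1) Exact hit on the hashed prompt.
        hit = self.exact.get(self._key(prompt))
        if hit is not None:
            return hit
        # 2) Semantic hit: best stored prompt above the threshold.
        tokens = set(prompt.lower().split())
        best = max(
            self.semantic,
            key=lambda entry: self._similarity(tokens, entry[0]),
            default=None,
        )
        if best and self._similarity(tokens, best[0]) >= self.threshold:
            return best[1]
        return None
```

A rephrased prompt with the same token set misses the exact tier but is served from the semantic tier, which is the behavior a semantic cache adds over a plain key-value cache.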
§01 Install
Remote endpoint
Streamable HTTP / SSE endpoint. Add to any MCP client that supports remote servers.
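As a rough sketch of what an MCP client does when first connecting to a streamable HTTP endpoint, the snippet below builds the JSON-RPC 2.0 `initialize` request that would be POSTed to the endpoint URL below; the `protocolVersion` and `clientInfo` values are illustrative assumptions, so substitute whatever your MCP client actually sends.

```python
import json

# Remote endpoint from this listing.
MCP_ENDPOINT = "https://cache.api.ainode.tech/mcp"


def build_initialize_request(request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 `initialize` request an MCP client sends first.

    The protocolVersion and clientInfo fields are assumptions for
    illustration, not values taken from this listing.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # assumed; use your client's version
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }


if __name__ == "__main__":
    # An HTTP client would POST this body to MCP_ENDPOINT with headers
    # Accept: application/json, text/event-stream (streamable HTTP transport).
    print(json.dumps(build_initialize_request(), indent=2))
```

In practice you rarely build this by hand: MCP clients that support remote servers take the endpoint URL and perform the handshake themselves.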
https://cache.api.ainode.tech/mcp

§03 MCP Quality Score · methodology
freshness: 25
completeness: 5
installability: 25
documentation: 10
stability: 5
§04 Alternatives in AI & LLMs
OpenAI Tools MCP Server
ai.com.mcp/openai-tools
Focused MCP server for OpenAI image/audio generation (v2.0.0). Wraps endpoints via HAPI CLI.
ai.llmse/mcp
ai.llmse/mcp
Public MCP server for the LLM Search Engine
Perplexity API Platform
ai.perplexity/mcp-server
Real-time web search, reasoning, and research through Perplexity's API