Atlas
Atlas is a YAML-defined semantic layer for analytics — authored by humans, consumed by AI agents.
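To give a feel for what "YAML-defined semantic layer" means, here is a rough illustrative sketch of a model definition. Every key and name below is hypothetical; Atlas's actual schema may differ, so consult the Atlas documentation for the real format.

```yaml
# Hypothetical sketch of a semantic-layer model definition.
# Keys and names are illustrative, not Atlas's actual schema.
models:
  - name: orders              # logical model over a warehouse table
    table: analytics.orders
    dimensions:
      - name: order_date
        type: time
      - name: region
        type: string
    measures:
      - name: revenue
        sql: SUM(amount)      # a governed aggregation an agent can query
```

The point of such a layer is that humans encode table structure, joins, and approved aggregations once, and AI agents query those named dimensions and measures instead of writing raw SQL against the warehouse.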
{
  "mcpServers": {
    "atlas": {
      "command": "npx",
      "args": ["-y", "@useatlas/mcp"],
      "env": {
        "ATLAS_DATASOURCE_URL": "<your-atlas_datasource_url>",
        "ATLAS_PROVIDER": "<atlas_provider>",
        "ATLAS_API_URL": "<atlas_api_url>"
      }
    }
  }
}

The server can also be run directly with: npx -y @useatlas/mcp

Environment variables:

ATLAS_DATASOURCE_URL: Analytics datasource URL (postgres, mysql, clickhouse, snowflake, duckdb, bigquery, salesforce). Optional; falls back to a bundled NovaMart SQLite demo when unset.
ATLAS_PROVIDER: LLM provider (anthropic, openai, bedrock, ollama, openai-compatible, gateway). Optional; the MCP server does not call an LLM directly (the client does), so set this only if you also run Atlas's chat or scheduler.
ATLAS_API_URL: URL of a running Atlas API. Optional; defaults to http://localhost:3001. When set, the MCP server inherits the API's connections, semantic layer, and governance.
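Putting the variables together, a local launch against a Postgres warehouse might look like the following. The connection string is a placeholder, and ATLAS_API_URL is shown at its documented default; any variable left unset falls back to the behavior described above.

```shell
# Hypothetical local run; the DSN is a placeholder, not a real database.
export ATLAS_DATASOURCE_URL="postgres://analytics_ro:secret@localhost:5432/warehouse"
export ATLAS_API_URL="http://localhost:3001"   # documented default, shown for clarity
npx -y @useatlas/mcp
```

With no variables set at all, the same `npx -y @useatlas/mcp` command serves the bundled NovaMart SQLite demo instead.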