io.github.TheAiSingularity/agentic-research
Local research agent that verifies its own answers. Runs on Gemma 3 4B + Ollama, $0/query.
{
  "mcpServers": {
    "agentic-research": {
      "command": "uvx",
      "args": [
        "agentic-research-engine"
      ],
      "env": {
        "OPENAI_BASE_URL": "<openai_base_url>",
        "OPENAI_API_KEY": "<your-openai_api_key>",
        "MODEL_SYNTHESIZER": "<model_synthesizer>",
        "MODEL_PLANNER": "<model_planner>",
        "EMBED_MODEL": "<embed_model>",
        "SEARXNG_URL": "<searxng_url>",
        "LOCAL_CORPUS_PATH": "<local_corpus_path>",
        "ENABLE_RERANK": "<enable_rerank>",
        "ENABLE_FETCH": "<enable_fetch>"
      }
    }
  }
}
OPENAI_BASE_URL: Any OpenAI-compatible endpoint. Default: OpenAI cloud. Use http://localhost:11434/v1 for Ollama.
OPENAI_API_KEY: API key for the endpoint above. Use 'ollama' as a sentinel value when running locally against Ollama.
MODEL_SYNTHESIZER: Model identifier used for the synthesize node. Defaults to 'gpt-5-mini'; set to 'gemma3:4b' for Mac-local Ollama.
MODEL_PLANNER: Model for the plan / classify / critic / compress / verify nodes. Defaults to 'gpt-5-nano'.
EMBED_MODEL: Embedding model identifier (for retrieval and memory). Default 'text-embedding-3-small'; use 'nomic-embed-text' on Ollama.
SEARXNG_URL: Base URL of the SearXNG meta-search instance. Default http://localhost:8888.
LOCAL_CORPUS_PATH: Path to an index directory built via scripts/index_corpus.py. When set, local corpus hits augment web search.
ENABLE_RERANK: Set to '1' to enable the BAAI/bge-reranker-v2-m3 cross-encoder rerank stage. The first run downloads ~560MB of model weights.
ENABLE_FETCH: Set to '0' to skip the trafilatura full-page fetch stage. Default '1'.
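Putting the placeholders together, a fully local setup might look like the sketch below. It assumes Ollama is serving gemma3:4b and nomic-embed-text on its default port and SearXNG is at its default URL; MODEL_PLANNER is also pointed at gemma3:4b here (an assumption, since the cloud default 'gpt-5-nano' is unreachable offline), rerank is left off to avoid the ~560MB download, and the optional LOCAL_CORPUS_PATH is omitted.

```json
{
  "mcpServers": {
    "agentic-research": {
      "command": "uvx",
      "args": ["agentic-research-engine"],
      "env": {
        "OPENAI_BASE_URL": "http://localhost:11434/v1",
        "OPENAI_API_KEY": "ollama",
        "MODEL_SYNTHESIZER": "gemma3:4b",
        "MODEL_PLANNER": "gemma3:4b",
        "EMBED_MODEL": "nomic-embed-text",
        "SEARXNG_URL": "http://localhost:8888",
        "ENABLE_RERANK": "0",
        "ENABLE_FETCH": "1"
      }
    }
  }
}
```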