MCP Server
The Airweave MCP server implements the Model Context Protocol to let AI assistants search your synced data. It supports two deployment modes: local (stdio) for desktop AI clients and hosted (Streamable HTTP) for cloud platforms.
Prerequisites
Before you start, you’ll need:
- A collection with data: at least one source connection must have completed its initial sync. See the Quickstart if you need to set this up.
- An API key: Create one in the Airweave dashboard under API Keys.
Local mode (Desktop AI clients)
Local mode runs the MCP server as a local process that communicates over stdio. This is the standard setup for desktop AI assistants.
Cursor
The steps below are for Cursor; the Claude Desktop and VS Code setups follow the same pattern in their respective MCP settings.
Requirement: Cursor version 0.45.6 or later
- Open Cursor Settings
- Go to Features > MCP Servers
- Click "+ Add new global MCP server"
- Add this configuration:
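A representative configuration is shown below. The npm package name (`airweave-mcp-search`) and the environment variable names are assumptions; substitute whatever your Airweave MCP server distribution documents.

```json
{
  "mcpServers": {
    "airweave-search": {
      "command": "npx",
      "args": ["-y", "airweave-mcp-search"],
      "env": {
        "AIRWEAVE_API_KEY": "your-api-key",
        "AIRWEAVE_COLLECTION": "your-collection-readable-id"
      }
    }
  }
}
```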
Environment variables (local mode)
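In local mode the server is configured entirely through its environment, which the desktop client passes to the spawned process. A representative set is sketched below; the variable names are assumptions and may differ in your build.

```bash
# Environment passed to the MCP server process by the desktop client (names assumed).
AIRWEAVE_API_KEY="your-api-key"                    # required: authenticates against the Airweave API
AIRWEAVE_COLLECTION="your-collection-readable-id"  # required: the collection exposed via the search tool
AIRWEAVE_BASE_URL="https://api.airweave.ai"        # optional: override when self-hosting
```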
Hosted mode (Cloud AI platforms)
Hosted mode runs the MCP server as a stateless HTTP service at https://mcp.airweave.ai/mcp. This is the setup for cloud-based AI platforms that need a remote MCP endpoint. The server uses the Streamable HTTP transport (MCP 2025-03-26).
Each request is fully independent — authentication and collection selection happen per-request via HTTP headers. No sessions or server-side state.
OpenAI Agent Builder
Any HTTP MCP Client
In the OpenAI Agent Builder, add a new MCP tool with:
- URL: `https://mcp.airweave.ai/mcp`
- Headers:
  - `X-API-Key`: Your Airweave API key
  - `X-Collection-Readable-ID`: Your collection’s readable ID
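For any other HTTP MCP client, the same endpoint and headers apply. As a sketch, here is a raw Streamable HTTP request that lists the available tools (`tools/list` is a standard MCP JSON-RPC method; the header values are placeholders):

```bash
curl -s -X POST https://mcp.airweave.ai/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "X-API-Key: your-api-key" \
  -H "X-Collection-Readable-ID: your-collection-readable-id" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'
```

Because the server is stateless, no session negotiation is needed; every request simply carries its own headers.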
Authentication (hosted mode)
The API key can be supplied in more than one way; the server checks the accepted methods in order. The `X-API-Key` header shown above works with any HTTP client.
Collection selection (hosted mode)
Because the server keeps no state between requests, every request must name its target collection via the `X-Collection-Readable-ID` header, set to the collection’s readable ID.
Endpoints
The hosted server exposes two HTTP endpoints: `/mcp`, the Streamable HTTP MCP endpoint, and `/health`, a simple health check.
Available tools
The MCP server exposes two tools to AI assistants:
search-{collection}
The primary search tool. The tool name includes the collection ID so the AI assistant knows which dataset it’s searching (e.g., search-my-docs).
Parameters:
The tool takes a natural-language query plus optional tuning parameters (`limit`, `offset`, `response_type`, `recency_bias`, `score_threshold`, `search_method`, `expansion_strategy`, `enable_reranking`), illustrated in the examples below.
The MCP server uses the legacy search parameter names for backwards compatibility. These are automatically mapped to the new API parameter names by the SDK. See the Search documentation for the mapping between legacy and new parameter names.
Natural language examples — the AI assistant will map these to the correct parameters automatically:
- “Find the first 5 results” → `limit: 5`
- “Show me results 11-20” → `limit: 10, offset: 10`
- “Give me a summary” → `response_type: "completion"`
- “Find the most recent documents” → `recency_bias: 0.8`
- “Find high-quality results only” → `score_threshold: 0.8`
- “Use keyword search” → `search_method: "keyword"`
- “Don’t expand my query” → `expansion_strategy: "no_expansion"`
- “Disable reranking” → `enable_reranking: false`
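On the wire, these requests become standard MCP `tools/call` invocations. The sketch below assumes a collection with readable ID `my-docs`; the `query` argument name is an assumption, while the tuning arguments mirror the legacy parameters above.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search-my-docs",
    "arguments": {
      "query": "latest quarterly planning notes",
      "limit": 5,
      "recency_bias": 0.8,
      "response_type": "completion"
    }
  }
}
```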
get-config
Returns the current server configuration: collection ID, base URL, API key status, and available tools. No parameters.
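Since the tool takes no parameters, the corresponding `tools/call` request is minimal; a sketch:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "get-config",
    "arguments": {}
  }
}
```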
Architecture
- Local mode: One MCP server process per user. API key and collection are set via environment variables. Communication over stdio.
- Hosted mode: Stateless HTTP server. A fresh MCP server instance is created for each request. API key and collection come from request headers.
In both modes, the MCP server is a thin wrapper around the Airweave SDK. It validates parameters, calls the search API, and formats results for the AI assistant.
Self-hosting
If you’re running Airweave on your own infrastructure, point the MCP server at your instance:
Local mode:
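For local mode, add your instance’s base URL to the server’s environment in the client configuration. The package and variable names below are the same assumptions used earlier; replace http://localhost:8001 with wherever your Airweave API is reachable.

```json
{
  "mcpServers": {
    "airweave-search": {
      "command": "npx",
      "args": ["-y", "airweave-mcp-search"],
      "env": {
        "AIRWEAVE_API_KEY": "your-api-key",
        "AIRWEAVE_COLLECTION": "your-collection-readable-id",
        "AIRWEAVE_BASE_URL": "http://localhost:8001"
      }
    }
  }
}
```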
Hosted mode (Docker):
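For hosted mode, run the MCP server container next to your instance and point it at your API. The image name and environment variable below are placeholders and assumptions; build the image from the Airweave MCP server source or use your registry’s tag.

```bash
docker run -d \
  --name airweave-mcp \
  -p 8080:8080 \
  -e AIRWEAVE_BASE_URL="http://your-airweave-api:8001" \
  your-registry/airweave-mcp-server:latest
```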
The hosted mode Docker image exposes port 8080 with a health check at /health.