Model Context Protocol (MCP) integration for Claude Context - A powerful MCP server that enables AI assistants and agents to index and search codebases using semantic search.
📖 New to Claude Context? Check out the main project README for an overview and setup instructions.
Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g. Claude Code.
Before using the MCP server, make sure you have:
- API key for your chosen embedding provider (OpenAI, VoyageAI, Gemini, or Ollama setup)
- Milvus vector database (local or cloud)
💡 Setup Help: See the main project setup guide for detailed installation instructions.
Claude Context MCP supports multiple embedding providers. Choose the one that best fits your needs:
💡 Tip: You can also use global environment variables for easier configuration management across different MCP clients.
# Supported providers: OpenAI, VoyageAI, Gemini, Ollama
EMBEDDING_PROVIDER=OpenAI
1. OpenAI Configuration (Default)
OpenAI provides high-quality embeddings with excellent performance for code understanding.
# Required: Your OpenAI API key
OPENAI_API_KEY=sk-your-openai-api-key
# Optional: Specify embedding model (default: text-embedding-3-small)
EMBEDDING_MODEL=text-embedding-3-small
# Optional: Custom API base URL (for Azure OpenAI or other compatible services)
OPENAI_BASE_URL=https://api.openai.com/v1
Available Models:
- `text-embedding-3-small` (1536 dimensions, faster, lower cost)
- `text-embedding-3-large` (3072 dimensions, higher quality)
- `text-embedding-ada-002` (1536 dimensions, legacy model)
Getting API Key:
- Visit OpenAI Platform
- Sign in or create an account
- Generate a new API key
- Set up billing if needed
2. VoyageAI Configuration
VoyageAI offers specialized code embeddings optimized for programming languages.
# Required: Your VoyageAI API key
VOYAGEAI_API_KEY=pa-your-voyageai-api-key
# Optional: Specify embedding model (default: voyage-code-3)
EMBEDDING_MODEL=voyage-code-3
Available Models:
- `voyage-code-3` (1024 dimensions, optimized for code)
- `voyage-3` (1024 dimensions, general purpose)
- `voyage-3-lite` (512 dimensions, faster inference)
Getting API Key:
- Visit VoyageAI Console
- Sign up for an account
- Navigate to API Keys section
- Create a new API key
3. Gemini Configuration
Google's Gemini provides competitive embeddings with good multilingual support.
# Required: Your Gemini API key
GEMINI_API_KEY=your-gemini-api-key
# Optional: Specify embedding model (default: gemini-embedding-001)
EMBEDDING_MODEL=gemini-embedding-001
Available Models:
- `gemini-embedding-001` (3072 dimensions, latest model)
Getting API Key:
- Visit Google AI Studio
- Sign in with your Google account
- Go to "Get API key" section
- Create a new API key
4. Ollama Configuration (Local/Self-hosted)
Ollama allows you to run embeddings locally without sending data to external services.
# Required: Specify which Ollama model to use
EMBEDDING_MODEL=nomic-embed-text
# Optional: Specify Ollama host (default: http://127.0.0.1:11434)
OLLAMA_HOST=http://127.0.0.1:11434
Available Models:
- `nomic-embed-text` (768 dimensions, recommended for code)
- `mxbai-embed-large` (1024 dimensions, higher quality)
- `all-minilm` (384 dimensions, lightweight)
Setup Instructions:
- Install Ollama from ollama.ai
- Pull the embedding model: `ollama pull nomic-embed-text`
- Ensure Ollama is running: `ollama serve`
Claude Context needs a vector database. You can sign up on Zilliz Cloud to get an API key.
Copy your Personal Key to replace `your-zilliz-cloud-api-key` in the configuration examples.
MILVUS_TOKEN=your-zilliz-cloud-api-key
You can set the embedding batch size to tune the MCP server's performance to your embedding model's throughput. The default value is 100.
EMBEDDING_BATCH_SIZE=512
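To illustrate what this setting controls, here is a minimal sketch (not the server's actual code) of how a batch size partitions code chunks into embedding requests:

```python
import os

def chunk_batches(items, batch_size):
    """Split a list of code chunks into batches of at most batch_size."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# Read the batch size the same way the server's env var implies, default 100.
batch_size = int(os.environ.get("EMBEDDING_BATCH_SIZE", "100"))

chunks = [f"chunk-{i}" for i in range(250)]
batches = chunk_batches(chunks, batch_size)
# With the default of 100, 250 chunks yield 3 batches of sizes 100, 100, 50.
print(len(batches))
```

A larger batch size means fewer round trips to the embedding API per indexing run, at the cost of larger individual requests; what value works best depends on your provider's rate and payload limits.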
You can configure custom file extensions and ignore patterns globally via environment variables:
# Additional file extensions to include beyond defaults
CUSTOM_EXTENSIONS=.vue,.svelte,.astro,.twig
# Additional ignore patterns to exclude files/directories
CUSTOM_IGNORE_PATTERNS=temp/**,*.backup,private/**,uploads/**
These settings work in combination with tool parameters: patterns from both sources are merged.
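The merge behavior can be sketched as follows. This is an illustrative model of the documented behavior, not the package's actual internals; the helper name is hypothetical:

```python
import os

def merged_patterns(env_var, tool_patterns):
    """Union of comma-separated env-var patterns and per-call tool patterns,
    preserving order and dropping duplicates."""
    env_patterns = [p.strip() for p in os.environ.get(env_var, "").split(",") if p.strip()]
    # dict.fromkeys keeps first-seen order while deduplicating
    return list(dict.fromkeys(env_patterns + tool_patterns))

os.environ["CUSTOM_IGNORE_PATTERNS"] = "temp/**,*.backup"
print(merged_patterns("CUSTOM_IGNORE_PATTERNS", ["private/**", "*.backup"]))
# → ['temp/**', '*.backup', 'private/**']
```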
Qwen Code
Create or edit the `~/.qwen/settings.json` file and add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Cursor
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Pasting the following configuration into your Cursor `~/.cursor/mcp.json` file is the recommended approach. You may also install it in a specific project by creating `.cursor/mcp.json` in your project folder. See the Cursor MCP docs for more info.
OpenAI Configuration (Default):
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "OpenAI",
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
VoyageAI Configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "VoyageAI",
"VOYAGEAI_API_KEY": "your-voyageai-api-key",
"EMBEDDING_MODEL": "voyage-code-3",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Gemini Configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "Gemini",
"GEMINI_API_KEY": "your-gemini-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Ollama Configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "Ollama",
"EMBEDDING_MODEL": "nomic-embed-text",
"OLLAMA_HOST": "http://127.0.0.1:11434",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Void
Go to: Settings -> MCP -> Add MCP Server
Add the following configuration to your Void MCP settings:
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Claude Desktop
Add to your Claude Desktop configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Claude Code
Use the command line interface to add the Claude Context MCP server:
# Add the Claude Context MCP server
claude mcp add claude-context -e OPENAI_API_KEY=your-openai-api-key -e MILVUS_TOKEN=your-zilliz-cloud-api-key -- npx @zilliz/claude-context-mcp@latest
See the Claude Code MCP documentation for more details about MCP server management.
Windsurf
Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
VS Code
The Claude Context MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Cherry Studio
Cherry Studio supports visual MCP server configuration through its settings interface. It doesn't directly support manual JSON configuration, but you can add a new server via the GUI:
- Navigate to Settings → MCP Servers → Add Server.
- Fill in the server details:
  - Name: `claude-context`
  - Type: `STDIO`
  - Command: `npx`
  - Arguments: `["@zilliz/claude-context-mcp@latest"]`
  - Environment Variables: `OPENAI_API_KEY`: `your-openai-api-key`, `MILVUS_TOKEN`: `your-zilliz-cloud-api-key`
- Save the configuration to activate the server.
Cline
Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:
- Open Cline and click on the MCP Servers icon in the top navigation bar.
- Select the Installed tab, then click Advanced MCP Settings.
- In the `cline_mcp_settings.json` file, add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
- Save the file.
Augment
To configure Claude Context MCP in Augment Code, you can use either the graphical interface or manual configuration.
- Click the hamburger menu.
- Select Settings.
- Navigate to the Tools section.
- Click the + Add MCP button.
- Enter the following command: `npx @zilliz/claude-context-mcp@latest`
- Name the MCP: Claude Context.
- Click the Add button.
- Press Cmd/Ctrl+Shift+P or go to the hamburger menu in the Augment panel
- Select Edit Settings
- Under Advanced, click Edit in settings.json
- Add the server configuration to the `mcpServers` array in the `augment.advanced` object:
"augment.advanced": {
"mcpServers": [
{
"name": "claude-context",
"command": "npx",
"args": ["-y", "@zilliz/claude-context-mcp@latest"]
}
]
}
Gemini CLI
Gemini CLI requires manual configuration through a JSON file:
- Create or edit the `~/.gemini/settings.json` file.
- Add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
- Save the file and restart Gemini CLI to apply the changes.
Roo Code
Roo Code utilizes a JSON configuration file for MCP servers:
- Open Roo Code and navigate to Settings → MCP Servers → Edit Global Config.
- In the `mcp_settings.json` file, add the following configuration:
{
"mcpServers": {
"claude-context": {
"command": "npx",
"args": ["@zilliz/claude-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
- Save the file to activate the server.
Other MCP Clients
The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:
npx @zilliz/claude-context-mcp@latest
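Over stdio transport, a client exchanges newline-delimited JSON-RPC messages with the server, starting with an `initialize` request. The sketch below builds such a request following the MCP specification's message shape; the client name, version, and protocol version string are illustrative:

```python
import json

# First message an MCP client sends over the server's stdin (shape per the
# MCP specification; values here are example placeholders).
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Each message is serialized as a single line of JSON.
line = json.dumps(initialize_request)
print(line)
```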
- 🔌 MCP Protocol Compliance: Full compatibility with MCP-enabled AI assistants and agents
- 🔍 Semantic Code Search: Natural language queries to find relevant code snippets
- 📁 Codebase Indexing: Index entire codebases for fast semantic search
- 🔄 Auto-Sync: Automatically detects and synchronizes file changes to keep index up-to-date
- 🧠 AI-Powered: Uses OpenAI embeddings and Milvus vector database
- ⚡ Real-time: Interactive indexing and searching with progress feedback
- 🛠️ Tool-based: Exposes three main tools via MCP protocol
Index a codebase directory for semantic search.
Parameters:
- `path` (required): Absolute path to the codebase directory to index
- `force` (optional): Force re-indexing even if already indexed (default: false)
- `splitter` (optional): Code splitter to use: 'ast' for syntax-aware splitting with automatic fallback, 'langchain' for character-based splitting (default: "ast")
- `customExtensions` (optional): Additional file extensions to include beyond defaults (e.g., ['.vue', '.svelte', '.astro']). Extensions should include the dot prefix, or it will be added automatically (default: [])
- `ignorePatterns` (optional): Additional ignore patterns to exclude specific files/directories beyond defaults (e.g., ['static/', '*.tmp', 'private/']) (default: [])
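A client invokes this tool through a standard MCP `tools/call` request carrying the parameters above as arguments. The sketch below builds such a request; the tool name `index_codebase` is an assumption for illustration, so check the server's `tools/list` response for the actual name:

```python
import json

# Hypothetical tools/call request for the indexing tool; path and arguments
# are placeholders.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "index_codebase",  # assumed name; verify via tools/list
        "arguments": {
            "path": "/absolute/path/to/your/project",
            "force": False,
            "splitter": "ast",
            "customExtensions": [".vue", ".svelte"],
            "ignorePatterns": ["private/**"],
        },
    },
}
print(json.dumps(call_request, indent=2))
```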
Search the indexed codebase using natural language queries.
Parameters:
- `path` (required): Absolute path to the codebase directory to search in
- `query` (required): Natural language query to search for in the codebase
- `limit` (optional): Maximum number of results to return (default: 10, max: 50)
Clear the search index for a specific codebase.
Parameters:
- `path` (required): Absolute path to the codebase directory to clear the index for
This package is part of the Claude Context monorepo. Please see:
- Main Contributing Guide - General contribution guidelines
- MCP Package Contributing - Specific development guide for this package
- @zilliz/claude-context-core - Core indexing engine used by this MCP server
- VSCode Extension - Alternative VSCode integration
- Model Context Protocol - Official MCP documentation
MIT - See LICENSE for details