Local, Private, and AI-Ready Code Documentation Server
RDContext is a local-first, privacy-focused tool to build AI-friendly context for any library based on its documentation files — including private repositories. Extract, index, and serve documentation for AI coding assistants, IDEs, and LLM workflows. Get better results.
RDContext MCP pulls up-to-date, version-specific, relevant documentation and code examples from a GitHub repository and makes it available to your IDE.
Add `use rdcontext` to your prompt in Cursor:

> Add a collapsible shadcn sidebar to the base layout of the app. use shadcn-ui, use rdcontext
RDContext fetches up-to-date code examples and documentation right into your LLM's context.
- Write your prompt
- Tell the LLM to use RDContext
- Get working code answers
No tab-switching, no hallucinated APIs that don't exist, no outdated code generations.
- Add a Library: Use the CLI to fetch documentation files (Markdown/MDX) from the specified repo/branch/tag/folders. For private repos, supply a GitHub token.
- AI Extraction: Each documentation file is processed by an LLM to extract code snippets, titles, and descriptions optimized for LLM retrieval.
- Local Indexing: All extracted data is stored in a local libSQL database in your user data directory.
- Query & Serve: Use the CLI to query docs/snippets, or start the MCP server to integrate with AI tools.
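The steps above map onto the CLI like this (a sketch using only the commands documented below; the repository name and topic are placeholders):

```shell
# 1. Index a library's docs (AI extraction and local indexing happen here)
rdcontext add shadcn-ui/ui --branch main

# 2. Query the local index for relevant snippets
rdcontext get shadcn-ui/ui "Select"

# 3. Serve everything to your MCP-capable editor
rdcontext start
```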
- CLI: Add, list and remove documentation of your tech stack using our CLI.
- MCP Server: Exposes a Model Context Protocol (MCP) server for integration with AI tools (Cursor, Windsurf, etc.).
- Local and Private: All data is stored locally using libSQL. No cloud or third-party storage required.
- Customizable: Choose which folders/files to index, and control which branches/tags to use.
- Internal Library Support: Add documentation from private GitHub repositories using a personal access token.
- Cross-Platform: Works on Linux, macOS, and Windows.
You can install RDContext globally via npm:

```shell
npm install -g rdcontext
```

Or use it directly with npx (no installation required):

```shell
npx -y rdcontext <command>
```
You must set the `OPENAI_API_KEY` environment variable for RDContext to work:

```shell
export OPENAI_API_KEY=sk-...
```

You can get your API key from OpenAI's dashboard.
All commands are available via the `rdcontext` CLI (or `npx -y rdcontext <command>`).
Index documentation from a GitHub repository (public or private):
```shell
rdcontext add <owner/repo> [--branch <branch>] [--tag <tag>] [--folders <folder1> <folder2> ...] [--token <github_token>]
```
- `--branch <branch>`: Git branch to index (default: repo default branch)
- `--tag <tag>`: Git tag to index (mutually exclusive with `--branch`)
- `--folders <folder1> <folder2> ...`: Only index documentation in these folders (default: all)
- `--token <github_token>`: GitHub token for private repo access
Examples:

Add up-to-date RDContext documentation:

```shell
rdcontext add cozmo-dev/rdcontext --branch main
```

Add documentation for a specific ShadCN UI version:

```shell
rdcontext add shadcn-ui/ui --tag [email protected]
```

Add documentation from a private repo in your GitHub org:

```shell
rdcontext add myorg/myrepo --branch main --folders docs src/guides --token ghp_xxx
```
List indexed libraries:

```shell
rdcontext list
```
Fetch up-to-date documentation for a library, optionally filtered by topic:
```shell
rdcontext get <owner/repo> [topic] [--k <number_of_snippets>]
```
- `topic`: (optional) Focus on a specific topic (e.g., "hooks", "routing")
- `--k`: Number of top snippets to return (default: 10)
Example:

```shell
rdcontext get shadcn-ui/ui "Select"
```
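The `topic` argument and `--k` flag can be combined; for instance, to fetch only the five most relevant snippets about routing (repository name is a placeholder):

```shell
rdcontext get shadcn-ui/ui "routing" --k 5
```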
Remove a library's documentation:

```shell
rdcontext rm <owner/repo>
```
Expose the documentation server for AI tools (Cursor, Windsurf, etc.):
```shell
rdcontext start [--transport stdio|httpStream] [--port <port>]
```
- `--transport`: Communication method (`stdio` for local, `httpStream` for HTTP API; default: `stdio`)
- `--port`: Port for HTTP transport (default: 3000)
RDContext exposes a Model Context Protocol (MCP) server, making it easy to integrate with popular AI coding tools and IDEs.
Requirements:
- Node.js >= v18.0.0
- Cursor, Windsurf, Claude Desktop, or another MCP client
Cursor
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Pasting the following configuration into your Cursor `~/.cursor/mcp.json` file is the recommended approach. You may also install in a specific project by creating `.cursor/mcp.json` in your project folder. See the Cursor MCP docs for more info.
```json
{
  "mcpServers": {
    "rdcontext": {
      "command": "rdcontext",
      "args": ["start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
Alternative: without a global installation:
```json
{
  "mcpServers": {
    "rdcontext": {
      "command": "npx",
      "args": ["-y", "rdcontext", "start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
Alternative: Using with Bun runtime
```json
{
  "mcpServers": {
    "rdcontext": {
      "command": "bunx",
      "args": ["-y", "rdcontext", "start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
Windsurf
Add this to your Windsurf MCP config file. See Windsurf MCP docs for more info.
```json
{
  "mcpServers": {
    "rdcontext": {
      "command": "rdcontext",
      "args": ["start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
Docker
If you prefer to run the MCP server in a Docker container:
- Build the Docker image:
  Build the image in the project root using a tag (e.g., `rdcontext`). Make sure Docker Desktop (or the Docker daemon) is running, then run the following command in the directory containing the `Dockerfile`:

  ```shell
  docker build -t rdcontext .
  ```
- Configure your MCP client:
  Update your MCP client's configuration to use the Docker command. Example for a `cline_mcp_settings.json`:

  ```json
  {
    "mcpServers": {
      "RDContext": {
        "autoApprove": [],
        "disabled": false,
        "timeout": 60,
        "command": "docker",
        "args": ["run", "-i", "--rm", "rdcontext"],
        "transportType": "stdio"
      }
    }
  }
  ```
Note: This is an example configuration. Please refer to the specific examples for your MCP client (like Cursor, VS Code, etc.) earlier in this README to adapt the structure (e.g., `mcpServers` vs `servers`). Also, ensure the image name in `args` matches the tag used during the `docker build` command.
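The build step above assumes a `Dockerfile` exists in the project root but the README does not show one; a minimal sketch (an assumption on our part, presuming the CLI is published on npm as `rdcontext`) might look like:

```dockerfile
FROM node:18-alpine

# Install the published CLI globally inside the image
# (assumption: the package name on npm is "rdcontext")
RUN npm install -g rdcontext

# The API key is supplied at runtime, e.g.:
#   docker run -i --rm -e OPENAI_API_KEY=sk-... rdcontext
ENTRYPOINT ["rdcontext", "start"]
```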
If you don't want to add `use rdcontext` to every prompt, you can define a simple rule in your `.windsurfrules` file in Windsurf, or in the Cursor Settings > Rules section in Cursor (or the equivalent in your MCP client), to auto-invoke RDContext on any code-related question:

```toml
[[calls]]
match = "when the user requests code examples, setup or configuration steps, or library/API documentation"
tool = "rdcontext"
```

From then on you'll get RDContext's docs in any related conversation without typing anything extra. You can add your own use cases to the `match` part.
- Database: Uses your OS's standard data directory (e.g., `~/.local/share/rdcontext/rdcontext.db` on Linux).
- No Cloud: All data remains on your machine unless you explicitly share it.
- Private by Default: All indexes are stored locally. ⚠️ During the AI extraction step, the selected documentation is sent to the OpenAI API (or whichever LLM you configure). If you need zero-egress processing, self-host the model or disable extraction.
- Private Repo Support: Your GitHub token is only used locally to fetch documentation.
- Run in development mode: `bun dev`
- Permission Errors: Ensure the data directory is writable (see the error message for the path).
- Private Repo Issues: Double-check your GitHub token and repo access.
- OpenAI Errors: Ensure your `OPENAI_API_KEY` environment variable is set.
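A quick way to confirm the variable is actually visible to the shell that launches your editor or the CLI:

```shell
# Prints "set" if OPENAI_API_KEY is defined and non-empty, nothing otherwise
echo "${OPENAI_API_KEY:+set}"
```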
License: MIT
Inspired by upstash/context7, but designed for local/private workflows.
Build your own private, AI-ready documentation server—locally.