ResultadosDigitais/rdcontext
RDContext

Local, Private, and AI-Ready Code Documentation Server

RDContext is a local-first, privacy-focused tool to build AI-friendly context for any library based on its documentation files — including private repositories. Extract, index, and serve documentation for AI coding assistants, IDEs, and LLM workflows. Get better results.


Overview

RDContext MCP pulls up-to-date, version-specific, relevant documentation and code examples from a GitHub repository and makes it available to your IDE.

Add use rdcontext to your prompt in Cursor:

Add a collapsible sidebar shadcn to the base layout of the app. use shadcn-ui, use rdcontext

RDContext fetches up-to-date code examples and documentation right into your LLM's context.

  1. Write your prompt
  2. Tell the LLM to use RDContext
  3. Get working code answers

No tab-switching, no hallucinated APIs that don't exist, no outdated code generations.

How it works

  1. Add a Library:
    Use the CLI to fetch documentation files (Markdown/MDX) from the specified repo/branch/tag/folders.
    For private repos, supply a GitHub token.

  2. AI Extraction:
    Each documentation file is processed by an LLM to extract code snippets, titles, and descriptions optimized for LLM retrieval.

  3. Local Indexing:
    All extracted data is stored in a local libSQL database in your user data directory.

  4. Query & Serve:
    Use the CLI to query docs/snippets, or start the MCP server to integrate with AI tools.

Features

  • CLI: Add, list, and remove documentation for your tech stack from the command line.
  • MCP Server: Exposes a Model Context Protocol (MCP) server for integration with AI tools (Cursor, Windsurf, etc.).
  • Local and Private: All data is stored locally using libSQL. No cloud or third-party storage required.
  • Customizable: Choose which folders/files to index, and control which branches/tags to use.
  • Internal Library Support: Add documentation from private GitHub repositories using a personal access token.
  • Cross-Platform: Works on Linux, macOS, and Windows.

Installation

You can install RDContext globally via npm:

npm install -g rdcontext

Or use it directly with npx (no installation required):

npx -y rdcontext <command>

Environment Setup

You must set the OPENAI_API_KEY environment variable for RDContext to work.

export OPENAI_API_KEY=sk-...

You can get your API key from OpenAI's dashboard.


Usage

CLI Overview

All commands are available via the rdcontext CLI (or npx -y rdcontext <command>).

Add a Library

Index documentation from a GitHub repository (public or private):

rdcontext add <owner/repo> [--branch <branch>] [--tag <tag>] [--folders <folder1> <folder2> ...] [--token <github_token>]
  • --branch <branch>: Git branch to index (default: repo default branch)
  • --tag <tag>: Git tag to index (mutually exclusive with --branch)
  • --folders <folder1> <folder2> ...: Only index documentation in these folders (default: all)
  • --token <github_token>: GitHub token for private repo access

Examples:

Add up-to-date RDContext documentation:

rdcontext add ResultadosDigitais/rdcontext --branch main

Add documentation for a specific ShadCN UI version by tag:

rdcontext add shadcn-ui/ui --tag <tag>

Add documentation of a private repo of your GitHub org:

rdcontext add myorg/myrepo --branch main --folders docs src/guides --token ghp_xxx

List Indexed Libraries

rdcontext list

Get Documentation

Fetch up-to-date documentation for a library, optionally filtered by topic:

rdcontext get <owner/repo> [topic] [--k <number_of_snippets>]
  • topic: (optional) Focus on a specific topic (e.g., "hooks", "routing")
  • --k: Number of top snippets to return (default: 10)

Example:

rdcontext get shadcn-ui/ui "Select"
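To pull more context in one shot, combine a topic with the `--k` option documented above:

```shell
# Return the top 20 snippets about the Select component
rdcontext get shadcn-ui/ui "Select" --k 20
```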

Remove a Library

rdcontext rm <owner/repo>

Start the MCP Server

Expose the documentation server for AI tools (Cursor, Windsurf, etc.):

rdcontext start [--transport stdio|httpStream] [--port <port>]
  • --transport: Communication method (stdio for local, httpStream for HTTP API, default: stdio)
  • --port: Port for HTTP transport (default: 3000)
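For clients that connect over HTTP rather than stdio, the flags above can be combined (the port here is an arbitrary example):

```shell
# Serve the MCP API over HTTP on port 8080 instead of the default stdio transport
rdcontext start --transport httpStream --port 8080
```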

MCP server

RDContext exposes a Model Context Protocol (MCP) server, making it easy to integrate with popular AI coding tools and IDEs.

Requirements:

  • Node.js >= v18.0.0
  • Cursor, Windsurf, Claude Desktop or another MCP Client
Cursor

Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server

Pasting the following configuration into your Cursor ~/.cursor/mcp.json file is the recommended approach. You may also install in a specific project by creating .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.

{
  "mcpServers": {
    "rdcontext": {
      "command": "rdcontext",
      "args": ["start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Alternative: Without global installation

If you haven't installed RDContext globally, use npx instead:

{
  "mcpServers": {
    "rdcontext": {
      "command": "npx",
      "args": ["-y", "rdcontext", "start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Alternative: Using with Bun runtime

If you use the Bun runtime, invoke the package with bunx:

{
  "mcpServers": {
    "rdcontext": {
      "command": "bunx",
      "args": ["-y", "rdcontext", "start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Windsurf

Add this to your Windsurf MCP config file. See Windsurf MCP docs for more info.

{
  "mcpServers": {
    "rdcontext": {
      "command": "rdcontext",
      "args": ["start"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Docker

If you prefer to run the MCP server in a Docker container:

  1. Build the Docker Image:

    From the project root (the directory containing the Dockerfile), build the image with a tag (e.g., rdcontext). Make sure Docker Desktop (or the Docker daemon) is running:

    docker build -t rdcontext .
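    The repository ships its own Dockerfile. For orientation only, a minimal sketch of such a file (assuming a Node 18+ base image and the npm package name `rdcontext`) might look like:

    ```dockerfile
    FROM node:18-alpine

    # Install the published CLI globally (assumes the npm package is named `rdcontext`)
    RUN npm install -g rdcontext

    # The server reads OPENAI_API_KEY from the environment; pass it at `docker run` time,
    # e.g. docker run -i --rm -e OPENAI_API_KEY rdcontext
    ENTRYPOINT ["rdcontext", "start"]
    ```

    Prefer the Dockerfile in the repository if it differs from this sketch.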
  2. Configure Your MCP Client:

    Update your MCP client's configuration to use the Docker command.

    Example for a cline_mcp_settings.json:

    {
      "mcpServers": {
        "RDContext": {
          "autoApprove": [],
          "disabled": false,
          "timeout": 60,
          "command": "docker",
          "args": ["run", "-i", "--rm", "rdcontext"],
          "transportType": "stdio"
        }
      }
    }

    Note: This is an example configuration. Refer to the examples for your MCP client earlier in this README (Cursor, Windsurf, etc.) to adapt the structure (e.g., mcpServers vs servers), and ensure the image name in args matches the tag you used with docker build.

Tips

Add a Rule

If you don't want to add use rdcontext to every prompt, you can define a simple rule that auto-invokes RDContext on any code-related question. In Windsurf, add it to your .windsurfrules file; in Cursor, use Settings > Rules (other MCP clients have equivalents):

[[calls]]
match = "when the user requests code examples, setup or configuration steps, or library/API documentation"
tool  = "rdcontext"

From then on, RDContext's docs are pulled into any matching conversation without extra typing. Extend the match expression to cover your own use cases.


Data Storage

  • Database:
    Uses your OS's standard data directory (e.g., ~/.local/share/rdcontext/rdcontext.db on Linux).
  • No Cloud:
    All data remains on your machine unless you explicitly share it.

Security & Privacy

  • Private by Default:
    All indexes are stored locally.
    ⚠️ During the AI Extraction step the selected documentation is sent to the OpenAI API (or whichever LLM you configure).
    If you need zero-egress processing, self-host the model or disable extraction.
  • Private Repo Support:
    Your GitHub token is only used locally to fetch documentation.

Contributing

  • Run in development mode:
    bun dev

Troubleshooting

  • Permission Errors:
    Ensure the data directory is writable (see error message for path).
  • Private Repo Issues:
    Double-check your GitHub token and repo access.
  • OpenAI Errors:
    Ensure your OPENAI_API_KEY environment variable is set.

License

MIT


Credits

Inspired by upstash/context7, but designed for local/private workflows.


Build your own private, AI-ready documentation server—locally.

About

From GitHub docs to LLM-ready context, optimized for code generation.