diff --git a/Code-Editors-and-LLM-Setup.md b/Code-Editors-and-LLM-Setup.md index 64deb202..bf3b47b3 100644 --- a/Code-Editors-and-LLM-Setup.md +++ b/Code-Editors-and-LLM-Setup.md @@ -1,12 +1,12 @@ -# Code Editors and LLM Setup +# Configuring Code Editors and LLMs -Karafka provides LLM-optimized documentation following the [llms.txt standard](https://llmstxt.org). This guide shows you how to configure your development environment to get better code assistance and faster problem-solving when working with Karafka. +Karafka provides LLM-optimized documentation following the [llms.txt standard](https://llmstxt.org). There are three ways to configure your development environment to get better code assistance and solve problems faster when working with Karafka: you can feed AI tools the Karafka documentation by providing a URL, by copying and pasting the docs, or by setting up your code editor to do it automatically. ## Setup Methods -### Method 1: Direct URL Access (ChatGPT, Perplexity, Copilot) +### Method 1: Providing a direct Karafka documentation URL (ChatGPT, Perplexity, Copilot) -For LLMs with web browsing capabilities, you can simply provide the URL: +For LLMs with web browsing capabilities, use one of the following prompts containing the Karafka documentation URL: ```text https://karafka.io/llms.txt @@ -16,87 +16,98 @@ What's the best way to get started with Karafka? ``` ```text -Please read https://karafka.io/llms.txt and then help me implement -a Karafka consumer with error handling and retries. +Please read https://karafka.io/llms.txt and then help me implement a Karafka consumer with error handling and retries. ``` -### Method 2: Manual Content Upload (Claude, Others) +### Method 2: Uploading Karafka documentation content For LLMs without web browsing: -1. **Get the documentation**: Visit [karafka.io/llms.txt](https://karafka.io/llms.txt) -2. **Copy content**: Copy the complete llms.txt content -3. **Paste into chat**: Provide context by pasting the content -4. **Ask questions**: Request specific implementation help +1. Get the documentation from [karafka.io/llms.txt](https://karafka.io/llms.txt). +2. Copy the complete llms.txt content. +3. Paste the llms.txt content into the chat window. +4. Request specific implementation help. -### Method 3: IDE Integration +### Method 3: Integrating your IDE -Configure your code editor to use Karafka documentation as context: +To configure your code editor to use Karafka documentation as context for AI-assisted development, follow the path specific to your IDE. + +The following process describes alternative approaches depending on your IDE's capabilities. Check the documentation of your AI assistant to determine which combination of remote URLs, local files, and configuration methods works best for your development environment. **General Setup Process:** -1. **Documentation Sources**: Add `https://karafka.io/llms.txt` to your IDE's documentation sources or knowledge base -2. **Local Copy**: Download and save llms.txt in your project's docs folder -3. **Workspace Context**: Configure the AI assistant to reference project documentation -4. **Enable Integration**: Turn on "use docs for context" or similar feature in your IDE's AI settings +1. Depending on your IDE's capabilities, perform one of the following steps: - Add `https://karafka.io/llms.txt` to the documentation sources or knowledge base of your IDE - Download and save `https://karafka.io/llms.txt` in your project's docs folder (see the example after this list) +2. Configure the AI assistant to reference project documentation, unless your IDE automatically detects and uses documentation files in your project. +3. To complete the integration, enable the "use docs for context" (or similar) feature in your IDE's AI settings.
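+For example, a quick way to keep such a local copy (assuming a `docs/` folder in your project; adjust the path to your layout):

```shell
# Download the latest LLM-optimized docs into the project's docs folder
curl -sSL -o docs/llms.txt https://karafka.io/llms.txt
```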
-**Common Configuration Locations:** +!!! note "Where to Find These Settings" -- **Settings/Preferences**: Look for "AI Assistant," "Documentation," or "Context" settings -- **Workspace Config**: Add to `.vscode/settings.json`, workspace files, or project configuration -- **Extension Settings**: Configure through AI extension preferences (Copilot, Codeium, etc.) + Most IDEs store AI configuration in Settings/Preferences under "AI Assistant" or "Documentation," in workspace config files like `.vscode/settings.json`, or through individual extension preferences. -**Implementation varies by IDE:** +## Advantages of LLM-Optimized Documentation -- Some IDEs automatically detect and use documentation files in your project -- Others require explicit URL configuration in settings -- Many allow both local file references and remote URL fetching -- Check your specific IDE's AI assistant documentation for exact setup steps +**When you work with traditional documentation search, you:** -## Why This Works Better +❌ Hunt through multiple pages. +❌ Miss important configuration details. +❌ Struggle with framework-specific patterns. +❌ Get generic Kafka advice instead of Karafka-specific guidance. -### Traditional Documentation Search +**When you work with AI-enhanced documentation, you:** -❌ Hunt through multiple pages -❌ Miss important configuration details -❌ Struggle with framework-specific patterns -❌ Get generic Kafka advice instead of Karafka-specific guidance +✅ Get comprehensive answers that reference multiple docs. +✅ Receive Karafka-specific best practices. +✅ Learn about Pro features when they solve your problems. +✅ Get code examples tailored to your use case. +✅ Understand the "why" behind configuration choices. -### AI-Enhanced Documentation +## Best Practices for AI Assistance -✅ Get comprehensive answers that reference multiple docs -✅ Receive Karafka-specific best practices -✅ Learn about Pro features when they solve your problems -✅ Get code examples tailored to your use case -✅ Understand the "why" behind configuration choices +### Prompts for Basic AI Assistance -## Best Practices for AI Assistance +This section teaches you how to ask effective questions when working with AI assistants on basic Karafka-related problems. Follow these rules to get the most useful replies. ### 1. **Be Specific About Your Setup** +See the difference between the two levels of detail in the following prompts. + +Good: + ```text -Good: "I'm using Karafka OSS with Rails 7 in production" -Better: "I'm using Karafka Pro with Rails 7, processing 10k msgs/minute" +"I'm using Karafka OSS with Rails 7 in production" +``` + +Better: + +```text +"I'm using Karafka Pro with Rails 7, processing 10k msgs/minute" ``` ### 2. **Mention Your Experience Level** -- **New to Kafka**: Get foundational explanations -- **Kafka expert, new to Karafka**: Focus on framework-specific patterns -- **Karafka user**: Get advanced optimization tips +You can specify your expertise level as one of the following: + +- **New to Kafka**: to get foundational explanations - **Kafka expert, new to Karafka**: for AI to focus on framework-specific patterns - **Karafka user**: to get advanced optimization tips ### 3. **Include Error Messages** -Paste complete error messages and stack traces for faster troubleshooting. +Provide your AI assistant with complete error messages and stack traces for faster troubleshooting. Paste them exactly as they are. ### 4. **Ask for Code Examples** +Use a prompt like the following so the AI illustrates the solution in detail. + ```text -"Show me how to implement a consumer that processes user events -with error handling and proper offset management" +"Show me how to implement a consumer that processes user events with error handling and proper offset management." ```
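+For orientation, a minimal sketch of the kind of consumer such a prompt is asking for (the class name and event-handling logic are hypothetical, not taken from the Karafka docs):

```ruby
# A minimal sketch of a consumer with per-message offset management.
# UserEventsConsumer and handle_event are hypothetical placeholders.
class UserEventsConsumer < ApplicationConsumer
  def consume
    messages.each do |message|
      handle_event(message.payload)
      # Mark the message as consumed so its offset can be committed after successful processing.
      # With default settings, an unhandled error here makes Karafka pause this partition and retry.
      mark_as_consumed(message)
    end
  end

  private

  def handle_event(payload)
    # Hypothetical domain logic for a single user event
    Karafka.logger.info("Processed user event: #{payload}")
  end
end
```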
-## Advanced AI Workflows +## Prompts for Advanced AI Workflows + +This section is a prompt engineering guide to help you work with AI assistants on more complex Karafka-related problems. ### Development Planning @@ -108,8 +119,7 @@ What Karafka components should I use and how should I structure it?" ### Performance Optimization ```text -"My Karafka consumers are falling behind. I'm processing 50k messages/hour -with current config: [paste config]. How can I optimize?" +"My Karafka consumers are falling behind. I'm processing 50k messages/hour with current config: [paste config]. How can I optimize?" ``` ### Production Troubleshooting @@ -128,21 +138,23 @@ Are there any anti-patterns or optimization opportunities?" ## Pro vs OSS Guidance -Karafka's AI-optimized documentation includes guidance about both Karafka OSS and Pro features. When asking questions: +Karafka's AI-optimized documentation provides guidance about both Karafka OSS and Pro features. When you ask questions: -- **Always mention which version you're using** -- **Ask about Pro features even if you're on OSS** - the AI will explain benefits and help you evaluate upgrades -- **Specify production vs development context** - recommendations differ significantly +- Mention which version you're using. +- Ask about Pro features even if you're on OSS. The AI will explain the benefits and help you evaluate upgrades. +- Specify production vs development context, as recommendations differ significantly. ## Tips for Better Results +In this section, you'll find a general list of dos and don'ts for prompting your AI assistant about Karafka. + ### ✅ **Do This** -- Provide complete context about your setup -- Ask follow-up questions for clarification -- Request code examples with explanations -- Mention specific error messages or log outputs -- Ask about testing strategies for your use case +- Provide complete context about your setup. +- Ask follow-up questions for clarification. +- Request code examples with explanations. +- Mention specific error messages or log outputs. +- Ask about testing strategies for your use case. ### ❌ **Avoid This** @@ -163,14 +175,14 @@ Karafka's AI-optimized documentation includes guidance about both Karafka OSS an ### What to Verify -- **Version compatibility** - Always check against current docs -- **Production considerations** - Test recommendations in staging -- **Security implications** - Review security-related suggestions carefully -- **Performance claims** - Benchmark in your environment +- **Version compatibility** – Always check against current docs +- **Production considerations** – Test recommendations in staging +- **Security implications** – Review security-related suggestions carefully +- **Performance claims** – Benchmark in your environment ## Automatic LLM Optimization -Karafka's documentation automatically detects AI services (OpenAI, Anthropic, GitHub Copilot, Perplexity, etc.)
through User-Agent detection and intelligently routes them to optimized content. +Karafka's documentation automatically detects AI services (OpenAI, Anthropic, GitHub Copilot, Perplexity, etc.) through User-Agent detection and intelligently routes them to optimized content. **Smart Content Routing:** @@ -179,8 +191,8 @@ Karafka's documentation automatically detects AI services (OpenAI, Anthropic, Gi This bidirectional routing provides the best possible experience for both regular users and machines: -- **40-65% token reduction** - No HTML tags, CSS classes, or JavaScript noise -- **Direct processing** - LLMs receive clean, semantic content without parsing overhead -- **Better context understanding** - More actual documentation fits within AI context windows +- **40-65% token reduction** – No HTML tags, CSS classes, or JavaScript noise +- **Direct processing** – LLMs receive clean, semantic content without parsing overhead +- **Better context understanding** – More actual documentation fits within AI context windows -The result is more accurate, framework-specific guidance when you reference Karafka documentation URLs in AI conversations. The optimization happens transparently - just use regular documentation URLs and the system handles the rest automatically. +The result is more accurate, framework-specific guidance when you reference Karafka documentation URLs in AI conversations. The optimization happens transparently as long as you use regular documentation URLs; the system handles the rest automatically. diff --git a/Configuration.md b/Configuration.md index b0df22da..5be0d69e 100644 --- a/Configuration.md +++ b/Configuration.md @@ -1,10 +1,10 @@ -Karafka contains multiple configuration options. To keep everything organized, all the configuration options were divided into two groups: +Karafka contains multiple configuration options. For better organization and separation of concerns, all configuration options are divided into two groups: - root `karafka` options - options directly related to the Karafka framework and its components. - kafka scoped `librdkafka` options - options related to [librdkafka](Librdkafka-Configuration) -To apply all those configuration options, you need to use the ```#setup``` method from the `Karafka::App` class: +To apply these configuration options, use the ```#setup``` method from the `Karafka::App` class: ```ruby class KarafkaApp < Karafka::App @@ -20,26 +20,35 @@ end !!! note - Karafka allows you to redefine some of the settings per each topic, which means that you can have a specific custom configuration that might differ from the default one configured at the app level. This allows you for example, to connect to multiple Kafka clusters. + Karafka allows you to redefine some of the settings per topic, which means that you can have a specific custom configuration that might differ from the default one configured at the app level. This allows you, for example, to connect to multiple Kafka clusters. -!!! note +!!! tip "Important" - kafka `client.id` is a string passed to the server when making requests. This is to track the source of requests beyond just IP/port by allowing a logical application name to be included in server-side request logging. Therefore the `client_id` should **not** be shared across multiple instances in a cluster or horizontally scaled application but distinct for each application instance. + kafka `client.id` is a string passed to the server when making requests. It is used to track the source of requests beyond just IP/port by allowing a logical application name to be included in server-side request logging. Therefore, the `client_id` should **not** be shared across multiple instances in a cluster or a horizontally scaled application but should be distinct for each application instance.
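+For illustration, a minimal sketch of deriving a distinct `client_id` per application instance in the setup block (the service name below is hypothetical):

```ruby
require 'socket'

# A distinct client_id per application instance, e.g. the service name plus the host,
# so server-side request logs can attribute traffic to the exact instance.
class KarafkaApp < Karafka::App
  setup do |config|
    config.client_id = "billing_service_#{Socket.gethostname}"
    config.kafka = { 'bootstrap.servers': '127.0.0.1:9092' }
  end
end
```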
## Karafka configuration options -A list of all the karafka configuration options with their details and defaults can be found [here](https://github.com/karafka/karafka/blob/master/lib/karafka/setup/config.rb). +A comprehensive list of the karafka configuration options with their details and defaults can be found [here](https://github.com/karafka/karafka/blob/master/lib/karafka/setup/config.rb). ## librdkafka driver configuration options -A list of all the configuration options related to `librdkafka` with their details and defaults can be found [here](Librdkafka-Configuration). +A complete list of the configuration options related to `librdkafka` with their details and defaults can be found [here](Librdkafka-Configuration). + +## Configuring external components -## External components configurators +The `app.initialized` event hook allows you to perform additional setup and configuration tasks for external components that depend on the internal settings of Karafka. This event is executed once per process, immediately after all framework components are ready, including dynamically built components. -For additional setup and/or configuration tasks, you can use the `app.initialized` event hook. It is executed **once** per process, right after all the framework components are ready (including those dynamically built). It can be used, for example, to configure some external components that need to be based on Karafka internal settings. +**Prerequisites** -Because of how the Karafka framework lifecycle works, this event is triggered after the `#setup` is done. You need to subscribe to this event before that happens, either from the `#setup` block or before. +1. Initiate the Karafka application setup. +2. Verify that the external components requiring configuration are available. + +**Procedure** + +1. Open your Karafka bootfile (`karafka.rb`). 2. Find the setup block and add the `app.initialized` event subscription inside your `setup` block. 3. Inside the event handler block, implement the configuration logic for your external components using available Karafka configuration values: + ```ruby class KarafkaApp < Karafka::App setup do |config| @@ -55,22 +64,41 @@ end ``` +**Result** + +Your external components will be automatically configured once per process after Karafka completes its initialization sequence. + +!!! note + The configuration will have access to all finalized Karafka settings and can reliably use framework components like loggers, metrics, and other initialized resources. ## Environment variables settings There are several env settings you can use with Karafka. They are described under the [Env Variables](Env-Variables) section of this Wiki. -## Messages compression +## Compressing messages Kafka lets you compress your messages as they travel over the wire. By default, producer messages are sent uncompressed. -Karafka producer ([WaterDrop](https://github.com/karafka/waterdrop)) supports following compression types: +!!! note
+ + Karafka producer ([WaterDrop](https://github.com/karafka/waterdrop)) supports the following compression types: + + - `gzip` + - `zstd` + - `lz4` + - `snappy` + +**Prerequisites:** + +If you plan to use `zstd`, you need to install `libzstd-dev`: + +```shell apt-get install -y libzstd-dev ``` -- `gzip` -- `zstd` -- `lz4` -- `snappy` +**Procedure** -You can enable the compression by using the `compression.codec` and `compression.level` settings: +1. To enable compression, use the `compression.codec` and `compression.level` settings: ```ruby class KarafkaApp < Karafka::App @@ -94,7 +122,7 @@ end ## Types of Configuration in Karafka -When working with Karafka, it is crucial to understand the different configurations available, as these settings directly influence how Karafka interacts with your application code and the underlying Kafka infrastructure. +When you work with Karafka, it is crucial to understand the different configurations available, as these settings directly influence how Karafka interacts with your application code and the underlying Kafka infrastructure. ### Root Configuration in the Setup Block @@ -111,7 +139,7 @@ class KarafkaApp < Karafka::App end ``` -### Kafka Scoped `librdkafka` Options +### Kafka-scoped `librdkafka` Options librdkafka configuration options are specified within the same setup block but scoped specifically under the `kafka` key. These settings are passed directly to the librdkafka library, the underlying Kafka client library that Karafka uses. This includes configurations for Kafka connections, such as bootstrap servers, SSL settings, and timeouts. @@ -132,7 +160,7 @@ end Karafka also supports the Admin Configs API, which is designed to view and manage configurations at the Kafka broker and topic levels. These settings are different from the client configurations (root and Kafka scoped) as they pertain to the infrastructure level of Kafka itself rather than how your application interacts with it. -Examples of these settings include: +Example settings include: - **Broker Configurations**: Like log file sizes, message sizes, and default retention policies. diff --git a/Getting-Started.md b/Getting-Started.md index 116eccbf..2e0e4a2e 100644 --- a/Getting-Started.md +++ b/Getting-Started.md @@ -1,6 +1,12 @@ ## Prerequisites -1. Verify that Apache Kafka is running. To set up Kafka, see the Instructions provided [here](Kafka-Setting-Up). +1. To verify that Apache Kafka is running, run the following command: + + ```shell docker ps | grep kafka ``` + +1. If it is not running, set up Kafka. For instructions, see [Setting Up Kafka](Kafka-Setting-Up). ## For Existing Applications @@ -19,9 +25,9 @@ **Result**: All necessary files and directories are generated: - - `karafka.rb` - main file where you configure Karafka and where you define which consumers should consume what topics. - - `app/consumers/example_consumer.rb` - example consumer. - - `app/consumers/application_consumer.rb` - base consumer from which all consumers should inherit. + - `karafka.rb` — the main file where you configure Karafka and define which consumers should consume what topics - `app/consumers/example_consumer.rb` — an example consumer - `app/consumers/application_consumer.rb` — the base consumer from which all consumers should inherit
1. To produce test messages, open the development console and enter: @@ -75,13 +81,13 @@ bundle exec karafka install ``` - the above command will create all the necessary files and directories to get you started: + **Result**: All necessary files and directories are generated: - - `karafka.rb` - main file where you configure Karafka and where you define which consumers should consume what topics. - - `app/consumers/example_consumer.rb` - example consumer. - - `app/consumers/application_consumer.rb` - base consumer from which all consumers should inherit. + - `karafka.rb` — the main file where you configure Karafka and where you define which consumers should consume what topics. - `app/consumers/example_consumer.rb` — an example consumer. - `app/consumers/application_consumer.rb` — the base consumer from which all consumers should inherit. -1. After that, you can run a development console to produce messages to this example topic: +1. Run a development console to produce messages to this example topic: ```ruby # Works from any place in your code and is thread-safe @@ -101,17 +107,19 @@ {"ping"=>"pong"} [dcf3a8d8-0bd9-433a-8f63-b70a0cdb0732] Consume job for ExampleConsumer on example finished in 0ms ``` + +1. (Optional) To install and configure the Web UI, see [Getting Started with the Web UI](Web-UI-Getting-Started). ## Example applications -If you have any problems setting up Karafka or want a ready application to play with, clone our examples repository: +If you have any problems setting up Karafka or need a ready application to play with, clone our examples repository: ```shell git clone https://github.com/karafka/example-apps ./example_apps ``` -and follow the instructions from [example apps Wiki](https://github.com/karafka/example-apps/blob/master/README.md). +For instructions, see the [Karafka Example Applications Wiki](https://github.com/karafka/example-apps/blob/master/README.md). ## Use cases, edge cases, and usage examples -Karafka ships with a full integration test suite that illustrates various use cases and edge cases of working with Karafka and Kafka. Please visit [this directory](https://github.com/karafka/karafka/tree/master/spec/integrations) of the Karafka repository. +Karafka ships with a full integration test suite that illustrates various use cases and edge cases of working with Karafka and Kafka. For a comprehensive understanding of our framework, visit the [integrations directory](https://github.com/karafka/karafka/tree/master/spec/integrations) of the Karafka repository. diff --git a/Web-UI-Getting-Started b/Web-UI-Getting-Started new file mode 100644 index 00000000..e69de29b