
feat: Recoverable keys feature for the dashboard #2110

Open · wants to merge 22 commits into base: main

Conversation

@harshsbhat (Contributor) commented Sep 18, 2024

What does this PR do?

Fixes #2097

This PR introduces the ability to create recoverable keys directly from the dashboard. It adds a Recoverable toggle to the key-creation flow; when enabled, the created key is encrypted and the encrypted data is stored. It also introduces an internal package named @unkey/vault, which allows the dashboard to interact with the vault.

https://www.loom.com/share/27f3a1b58e5a4d478b1510a80f679558?sid=e8faf269-7f80-4b8c-9229-7b55c547ebf2

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • Chore (refactoring code, technical debt, workflow improvements)
  • Enhancement (small improvements)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

Required

  • Filled out the "How to test" section in this PR
  • Read Contributing Guide
  • Self-reviewed my own code
  • Commented on my code in hard-to-understand areas
  • Ran pnpm build
  • Ran pnpm fmt
  • Checked for warnings, there are none
  • Removed all console.logs
  • Merged the latest changes from main onto my branch with git pull origin main
  • My changes don't cause any responsiveness issues

Appreciated

  • If a UI change was made: Added a screen recording or screenshots to this PR
  • Updated the Unkey Docs if changes were necessary

Summary by CodeRabbit

  • New Features

    • Introduced a new option for recoverable keys during key creation, enhancing user control over key management.
    • Added a user-friendly toggle for the recoverEnabled feature in the key creation form, with informative descriptions about security implications.
    • Enhanced encryption and storage options for keys, improving security and recovery options.
    • Integrated a new dependency for better vault management within the application.
  • Bug Fixes

    • Improved error handling and validation related to the new recoverEnabled feature.

changeset-bot bot commented Sep 18, 2024

⚠️ No Changeset found

Latest commit: 2bcfb89

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR

vercel bot commented Sep 18, 2024

@harshsbhat is attempting to deploy a commit to the Unkey Team on Vercel.

A member of the Team first needs to authorize it.

coderabbitai bot commented Sep 18, 2024

Walkthrough

The changes introduce a new feature in the API key creation process, allowing users to set a recoverEnabled option. This feature is implemented in the CreateKey component, where a toggle switch is added to the UI for users to enable or disable key recovery. Additionally, the backend logic in the createKey procedure is updated to handle this new parameter, facilitating the storage of encrypted keys when recovery is enabled.
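The flow above can be sketched as follows. This is a hedged illustration, not the PR's actual code: the vault client is stubbed with a synchronous placeholder (the real @unkey/vault call is asynchronous and goes over HTTP), and the encrypted-keys table is modeled as an in-memory array.

```typescript
import { randomUUID } from "node:crypto";

type EncryptResult = { encrypted: string; keyId: string };

// Stub standing in for the vault's encrypt call; base64 is NOT encryption,
// it is only a placeholder so the sketch runs.
function vaultEncryptStub(keyring: string, data: string): EncryptResult {
  return { encrypted: Buffer.from(`${keyring}:${data}`).toString("base64"), keyId: "kek_demo" };
}

// In-memory stand-in for the encryptedKeys table.
const encryptedKeys: Array<{ keyId: string; encrypted: string; encryptionKeyId: string }> = [];

function createKey(opts: { keyAuthId: string; recoverEnabled?: boolean }, keyId: string, plaintext: string): void {
  // ...the key record itself is inserted here regardless of recoverability...
  if (opts.recoverEnabled) {
    const requestId = randomUUID(); // unique id per encryption request, for tracing
    console.log(`encrypting key ${keyId} (request ${requestId})`);
    const res = vaultEncryptStub(opts.keyAuthId, plaintext);
    // Only the ciphertext and the key-encryption-key id are persisted.
    encryptedKeys.push({ keyId, encrypted: res.encrypted, encryptionKeyId: res.keyId });
  }
}

createKey({ keyAuthId: "ka_demo", recoverEnabled: true }, "key_1", "sk_live_example");
createKey({ keyAuthId: "ka_demo", recoverEnabled: false }, "key_2", "sk_live_example2");
```

A key created with the toggle off produces no encrypted record, which is the default behavior.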

Changes

Files and change summary:

  • apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx: Added recoverEnabled boolean property to the form schema and UI, capturing its state on form submission.
  • apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/page.tsx: Introduced checkStoreEncryptedKeys parameter to the CreateKeypage function, enhancing the context for the CreateKey component.
  • apps/dashboard/lib/trpc/routers/key/create.ts: Introduced recoverEnabled parameter in the createKey procedure, modifying logic for key recovery and encryption.
  • apps/dashboard/package.json: Added @unkey/vault dependency to support enhanced vault management features.

Assessment against linked issues

  • Allow setting recoverable: true when creating keys in the dashboard (Issue #2097): addressed.
  • Update the trpc handler to handle the new recoverEnabled logic (Issue #2097): addressed.
  • Ensure compatibility with externalId for keys created in the dashboard (Issue #2097): not addressed. The changes do not address the need for externalId compatibility.

Possibly related PRs

  • feat: Show keyId after creating a new key #2107: This PR modifies the CreateKey component, enhancing the user interface for displaying API key information, which is closely related to the overall functionality of key management introduced in the main PR.

Suggested reviewers

  • mcstepp
  • perkinsjr
  • chronark

Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between ca22641 and 2bcfb89.

Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
Files selected for processing (1)
  • apps/dashboard/package.json (1 hunks)
Files skipped from review as they are similar to previous changes (1)
  • apps/dashboard/package.json


github-actions bot commented Sep 18, 2024

Thank you for following the naming conventions for pull request titles! 🙏

vercel bot commented Sep 18, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name        Status    Updated (UTC)
dashboard   ✅ Ready  Sep 19, 2024 4:48pm
planetfall  ✅ Ready  Sep 19, 2024 4:48pm
play        ✅ Ready  Sep 19, 2024 4:48pm
workflows   ✅ Ready  Sep 19, 2024 4:48pm
www         ✅ Ready  Sep 19, 2024 4:48pm

@perkinsjr (Collaborator) commented:

@harshsbhat this currently has a frozen lockfile mismatch, preventing it from building:

ERR_PNPM_OUTDATED_LOCKFILE  Cannot install with "frozen-lockfile" because pnpm-lock.yaml is not up to date with internal/vault/package.json

Just go ahead and delete the pnpm-lock.yaml and run pnpm i and then commit the lock file.

@perkinsjr (Collaborator) left a comment:

See note

@harshsbhat (Contributor, Author) replied:

See note

Done!

coderabbitai bot left a comment:

Actionable comments posted: 0

Outside diff range and nitpick comments (2)
internal/vault/package.json (1)

1-15: LGTM! The package.json file is well-structured and properly configured.

The file correctly sets up the essential metadata for the @unkey/vault package, including the name, version, main entry point, and type definitions. The use of TypeScript and the inclusion of necessary dev dependencies are appropriate for the package's development.

Consider adding the following enhancements to further improve the package:

  1. Add a scripts section to define common tasks like building, testing, and linting the package. For example:
"scripts": {
  "build": "tsc",
  "test": "jest",
  "lint": "eslint src/**/*.ts"
}
  2. Include a README.md file in the package's root directory to provide documentation and usage instructions for the package. This will help users understand how to integrate and utilize the @unkey/vault package effectively.
apps/dashboard/lib/trpc/routers/key/create.ts (1)

1-1: Remove unused import.

The Api import from @/app/(app)/settings/root-keys/[keyId]/permissions/api is not used in the changed code. Please remove it to keep the code clean.

Apply this diff to remove the unused import:

-import { Api } from "@/app/(app)/settings/root-keys/[keyId]/permissions/api";
 import { db, schema } from "@/lib/db";
Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 5fb8bb9 and c8d08d0.

Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
Files selected for processing (5)
  • apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (5 hunks)
  • apps/dashboard/lib/trpc/routers/key/create.ts (3 hunks)
  • apps/dashboard/package.json (1 hunks)
  • internal/vault/package.json (1 hunks)
  • internal/vault/src/index.ts (1 hunks)
Additional comments not posted (13)
internal/vault/src/index.ts (5)

1-27: LGTM!

The type definitions for encryption and decryption requests and responses are well-structured and follow a consistent naming convention. The separation of single and bulk operations enhances clarity and usability.


29-31: LGTM!

The RequestContext type provides a structured way to include a unique requestId with each request, which can be useful for logging, tracing, and debugging purposes.


33-59: LGTM!

The Vault class and its encrypt method are well-implemented. The class encapsulates the encryption functionality and provides a clean interface for other modules to use. The encrypt method follows a standard pattern for making an HTTP request and includes appropriate error handling.


61-81: LGTM!

The encryptBulk method of the Vault class is well-implemented and follows a similar pattern to the encrypt method. It handles bulk encryption requests and includes appropriate error handling.


83-100: LGTM!

The decrypt method of the Vault class is well-implemented and follows a similar pattern to the encrypt and encryptBulk methods. It handles decryption requests and includes appropriate error handling.
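Based on the review's description, a client of this shape might look like the following sketch: a class holding the vault URL and token, whose encrypt method POSTs a JSON body and throws on a non-OK response. The route path, header names, and field names are assumptions for illustration, not the actual @unkey/vault API.

```typescript
type RequestContext = { requestId: string };
type EncryptRequest = { keyring: string; data: string };
type EncryptResponse = { encrypted: string; keyId: string };

class Vault {
  constructor(
    private readonly baseUrl: string,
    private readonly token: string,
  ) {}

  async encrypt(ctx: RequestContext, req: EncryptRequest): Promise<EncryptResponse> {
    // Illustrative endpoint path; the real service route may differ.
    const res = await fetch(`${this.baseUrl}/vault.v1.VaultService/Encrypt`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.token}`,
        "Unkey-Request-Id": ctx.requestId, // propagate the request id for tracing
      },
      body: JSON.stringify(req),
    });
    if (!res.ok) {
      // Surface the raw body so failures are diagnosable upstream.
      throw new Error(`unable to encrypt, fetch error: ${await res.text()}`);
    }
    return (await res.json()) as EncryptResponse;
  }
}
```

The encryptBulk and decrypt methods described in the review would follow the same request/throw-on-error pattern against their own routes and payload types.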

apps/dashboard/package.json (1)

56-56: LGTM!

The addition of the @unkey/vault dependency aligns with the stated PR objectives and the feature implementation for recoverable keys. It will facilitate the necessary interaction between the dashboard and the vault.

apps/dashboard/lib/trpc/routers/key/create.ts (2)

21-21: LGTM!

The addition of the optional recoverEnabled parameter to the createKey procedure is a good way to introduce the new recoverable keys feature without breaking existing functionality.


106-121: Excellent work on the key encryption and storage implementation!

The conditional block for handling key encryption and storage when recoverEnabled is true and storeEncryptedKeys is enabled in the key auth is well-structured and follows best practices:

  • Using environment variables for the vault configuration keeps sensitive information secure.
  • Generating a unique request ID for each encryption operation aids in auditing and debugging.
  • Storing the encrypted key and its metadata in a separate encryptedKeys table ensures secure and efficient key recovery when needed.

Great job on this implementation!

apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (5)

81-81: LGTM!

The addition of the recoverEnabled property to the form schema is a good way to introduce the key recoverability feature. Setting the default value to false ensures that keys are not recoverable by default, which is a good security practice.


167-167: LGTM!

Adding the recoverEnabled property to the form's default values with a value of false is consistent with the addition of this property to the form schema. It ensures that the form's initial state has recoverEnabled set to false.


208-208: LGTM!

Including the recoverEnabled property in the object passed to key.mutateAsync when the form is submitted ensures that the value of recoverEnabled is passed to the server when creating a new key. This change is consistent with the addition of the recoverEnabled property to the form schema and default values.


297-297: LGTM!

Resetting the recoverEnabled property to false when the user clicks the "Create another key" button ensures that the recoverEnabled property is reset to its default value when creating a new key. This change is consistent with the addition of the recoverEnabled property to the form schema and default values.


812-859: LGTM!

The addition of the key recoverability card component is consistent with the introduction of the key recoverability feature. The toggle switch allows users to easily enable/disable key recoverability. The message displayed when key recoverability is enabled is important to inform users about the security implications of this feature. Including a link to the documentation is helpful for users who want to learn more about key recoverability.

@harshsbhat (Contributor, Author) replied:

See note

After I push the pnpm lockfile, it shows 17,000+ lines changed for some reason. Is it supposed to look like that?

apps/dashboard/lib/trpc/routers/key/create.ts (outdated, resolved):
};
const requestId = crypto.randomUUID();
const context: RequestContext = { requestId };
const vaultRes = await vault.encrypt(context, encryptReq);
Collaborator commented:

What happens if vault encryption fails here? You aren't handling the negative path:

For example, if encryption fails, the user receives no error or success message on the frontend to inform them that it failed, and the key still gets upserted into the database without encryption.

[
    {
        "error": {
            "json": {
                "message": "Unable to encrypt, fetch error: error code: 1016",
                "code": -32603,
                "data": {
                    "code": "INTERNAL_SERVER_ERROR",
                    "httpStatus": 500,
                    "stack": "Error: Unable to encrypt, fetch error: error code: 1016\n    at (vc/edge/function:298:40314)\n    at (vc/edge/function:298:43255)\n    at (vc/edge/function:295:126266)\n    at (vc/edge/function:295:125662)\n    at (vc/edge/function:295:125662)\n    at (vc/edge/function:295:125662)\n    at (vc/edge/function:295:125985)\n    at (vc/edge/function:337:12967)\n    at (vc/edge/function:337:14421)",
                    "path": "key.create"
                }
            }
        }
    }
]
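One way to close this negative path, sketched under the assumption that encryption can be ordered before the database write: encrypt first, and convert a vault failure into a typed error the frontend can show as a toast. In the real tRPC router the thrown error would be a TRPCError with code "INTERNAL_SERVER_ERROR"; a local error class is used here to keep the sketch dependency-free.

```typescript
class EncryptionFailedError extends Error {}

async function encryptBeforeInsert(
  encrypt: (plaintext: string) => Promise<string>,
  plaintext: string,
): Promise<string> {
  try {
    // Encrypt first; the caller only inserts the key row after this succeeds.
    return await encrypt(plaintext);
  } catch (err) {
    // Because no row was written yet, the database cannot end up holding an
    // unencrypted "recoverable" key; the caller surfaces this to the user.
    throw new EncryptionFailedError(
      `Unable to encrypt key: ${err instanceof Error ? err.message : String(err)}`,
    );
  }
}
```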

Contributor (Author) replied:

[screenshot] The new changes will display a toast like this if encryption fails.

coderabbitai bot left a comment:

Actionable comments posted: 1

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between babbde4 and 4b74619.

Files selected for processing (2)
  • apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (6 hunks)
  • apps/dashboard/lib/trpc/routers/key/create.ts (3 hunks)
Additional comments not posted (6)
apps/dashboard/lib/trpc/routers/key/create.ts (3)

1-8: LGTM!

The new imports are relevant and necessary for the changes introduced in this PR.


21-21: LGTM!

The new optional recoverEnabled parameter is a relevant addition to support the recoverable keys feature.


106-143: Address the following concerns to improve error handling and user experience:

  1. The code does not defensively handle the case when recoverEnabled is true but storeEncryptedKeys is false. In such cases, the user may believe they have successfully created a recoverable key, but in reality, the key is not recoverable. Consider informing the user that recoverable keys are not supported for their workspace to avoid confusion.

  2. The error handling for encryption failure does not inform the user about the failure on the frontend, and the key still gets inserted into the database without encryption. This can lead to potential security risks and inconsistencies. Improve the error handling by:

    • Informing the user on the frontend about the encryption failure.
    • Preventing the insertion of the unencrypted key into the database when encryption fails.
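The first concern could be addressed with an up-front guard along these lines; the function name and message text are illustrative, not the router's actual code.

```typescript
// Reject the request before any insert when the workspace cannot actually
// store encrypted keys, instead of silently creating a non-recoverable key.
function assertRecoverSupported(recoverEnabled: boolean, storeEncryptedKeys: boolean): void {
  if (recoverEnabled && !storeEncryptedKeys) {
    throw new Error("Recoverable keys are not enabled for this workspace.");
  }
}
```

Running this check first means the user sees a clear error rather than a key that looks recoverable but is not.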
apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (3)

81-81: LGTM!

The addition of the recoverEnabled property to the form schema with a default value of false aligns with the PR objective and follows security best practices.


167-167: LGTM!

The recoverEnabled property is correctly initialized in the form's default values and included in the form submission payload, ensuring consistency with the form schema and correct propagation of the user's selection.

Also applies to: 302-302


817-864: Excellent work on the "Recoverable" feature card!

The addition of the "Recoverable" card with the toggle switch aligns perfectly with the PR objective and provides a clear way for users to control the recoverability of their keys.

The informational message is a great touch, as it effectively communicates the security implications of recoverable keys and directs users to the relevant documentation for more details. Mentioning that recoverable keys are securely stored in an encrypted vault helps address potential security concerns.

Overall, this implementation is well-thought-out and enhances the user experience while prioritizing security.

* fix: default ratelimits

* revert

* fix: cache ratelimits on cloudflare correctly

* chore: remove logs

* chore: remove log

* perf: remove unnecessary switch

* fix: track isolate start time

* test: tighten lower ratelimit threshold

* fix: only cache ratelimit blocks

* chore: sync lockfile

* test: improve accuracy of lower limit calculation in rate limit tests

* fix: address rabbit suggestions
coderabbitai bot left a comment:

Actionable comments posted: 12

Outside diff range and nitpick comments (3)
apps/agent/pkg/testutil/attack.go (1)

44-52: Potential blocking on responses channel

In the worker goroutines, if the responses channel is unbuffered and the caller does not consume the responses promptly, the workers might block when sending to responses <- fn():

for range ticks {
    responses <- fn()
    wg.Done()
}

Ensure that the caller processes the responses to prevent blocking. Alternatively, consider buffering the responses channel to accommodate bursts:

- responses := make(chan Response)
+ responses := make(chan Response, bufferSize)
apps/agent/integration/identities/identities_ratelimits_accuracy_test.go (2)

35-35: Remove commented-out code to keep the codebase clean

In line 35, there are commented-out values in the nKeys slice. It's best practice to remove or uncomment this code to enhance readability and maintainability.

Apply this diff to remove the commented code:

-for _, nKeys := range []int{1} { //, 3, 10, 1000} {
+for _, nKeys := range []int{1} {

57-79: Ensure consistent use of acronym casing in variable names

The variable externalId is used in several places. In Go, it's conventional to use ID in all caps for acronyms. Consider renaming externalId to externalID for consistency and readability.

Apply this diff to update the variable name:

-externalId := uid.New("testuser")
+externalID := uid.New("testuser")

...

_, err = sdk.Identities.CreateIdentity(ctx, operations.CreateIdentityRequestBody{
-   ExternalID: externalId,
+   ExternalID: externalID,
    Meta: map[string]any{
        "email": "[email protected]",
    },
})

...

_, err = sdk.Identities.UpdateIdentity(ctx, operations.UpdateIdentityRequestBody{
-   ExternalID: unkey.String(externalId),
+   ExternalID: unkey.String(externalID),
    Ratelimits: []operations.UpdateIdentityRatelimits{inferenceLimit},
})
Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 596b315 and 7ffdbc4.

Files selected for processing (12)
  • apps/agent/integration/identities/identities_ratelimits_accuracy_test.go (1 hunks)
  • apps/agent/integration/identities/token_ratelimits_test.go (2 hunks)
  • apps/agent/integration/keys/ratelimits_test.go (1 hunks)
  • apps/agent/pkg/circuitbreaker/lib.go (1 hunks)
  • apps/agent/pkg/clickhouse/flush.go (0 hunks)
  • apps/agent/pkg/testutil/attack.go (1 hunks)
  • apps/agent/services/eventrouter/service.go (0 hunks)
  • apps/api/src/pkg/middleware/init.ts (2 hunks)
  • apps/api/src/pkg/middleware/metrics.ts (1 hunks)
  • apps/api/src/pkg/ratelimit/client.ts (7 hunks)
  • apps/api/src/routes/v1_keys_verifyKey.ratelimit_accuracy.test.ts (1 hunks)
  • apps/api/src/routes/v1_ratelimit_limit.accuracy.test.ts (1 hunks)
Files not reviewed due to no reviewable changes (2)
  • apps/agent/pkg/clickhouse/flush.go
  • apps/agent/services/eventrouter/service.go
Files skipped from review due to trivial changes (2)
  • apps/agent/pkg/circuitbreaker/lib.go
  • apps/api/src/pkg/middleware/metrics.ts
Additional comments not posted (14)
apps/api/src/routes/v1_keys_verifyKey.ratelimit_accuracy.test.ts (1)

97-97: Verify the reasoning behind the change in the lowerLimit calculation.

The change in the lowerLimit calculation from exactLimit * 0.75 to exactLimit * 0.95 increases the threshold for the minimum number of passed requests that are considered acceptable during the test.

This change may make the test more lenient and potentially allow more test cases to pass. However, it may also reduce the effectiveness of the test in catching rate limiting issues, as it allows a higher margin of error.

Please provide the reasoning behind this change and confirm that it aligns with the intended behavior of the rate limiting functionality being tested.

apps/api/src/routes/v1_ratelimit_limit.accuracy.test.ts (1)

100-100: Tightened acceptance criteria for the rate limit test.

The change to the lowerLimit calculation, increasing the multiplier from 0.75 to 0.95, makes the test more stringent by requiring a higher percentage of requests to pass. This improves the test's ability to catch issues where the rate limiting implementation is too permissive.

However, be aware that this change may also lead to more test failures if the rate limiting accuracy is not consistently high enough to allow 95% of the expected requests. Monitor the test results closely to ensure that the implementation meets this new, stricter threshold.
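To make the change concrete, here is an illustrative computation of the acceptance window (not the test suite's actual code). For a limit of 100 requests per 60 s window measured over a 2 minute test, the exact limit is 200; the new 0.95 multiplier accepts no fewer than 190 passed requests, where the old 0.75 multiplier accepted as few as 150.

```typescript
// Lower bound on acceptable passed requests for a rate-limit accuracy test.
function lowerLimit(limitPerWindow: number, windowMs: number, testDurationMs: number, multiplier: number): number {
  // Whole windows that fit in the test duration, times the per-window limit.
  const exactLimit = limitPerWindow * Math.floor(testDurationMs / windowMs);
  return Math.floor(exactLimit * multiplier);
}

console.log(lowerLimit(100, 60_000, 120_000, 0.95)); // 190
console.log(lowerLimit(100, 60_000, 120_000, 0.75)); // 150
```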

apps/api/src/pkg/middleware/init.ts (2)

29-29: Introduce isolateCreatedAt variable to track isolate creation time.

Declaring the isolateCreatedAt variable at the module level with an initial value of undefined is a good approach. It allows for:

  1. Persisting the value across multiple invocations of the init function.
  2. Checking if the variable has been assigned a value later in the code.

This sets the foundation for consistently tracking the isolate creation time.


40-44: Assign isolateCreatedAt only once and set it in the context.

The conditional assignment of isolateCreatedAt ensures that it is set only once when it is undefined. This has several benefits:

  1. It accurately captures the initial creation time of the isolate.
  2. It prevents the timestamp from being reset on subsequent invocations of init.
  3. It maintains a consistent value for isolateCreatedAt throughout the isolate's lifecycle.

Setting isolateCreatedAt in the context object c provides a centralized way to access this value across the codebase, promoting consistency and reusability.
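The set-once pattern described above can be sketched as follows; the context object is modeled as a plain Map for illustration, not the middleware's actual context type.

```typescript
// Module-level: persists across invocations within the same isolate.
let isolateCreatedAt: number | undefined = undefined;

function init(c: Map<string, unknown>): void {
  if (isolateCreatedAt === undefined) {
    // First request in this isolate: record when it came up.
    isolateCreatedAt = Date.now();
  }
  // Expose the (stable) timestamp to the rest of the request pipeline.
  c.set("isolateCreatedAt", isolateCreatedAt);
}
```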

apps/agent/integration/identities/token_ratelimits_test.go (2)

31-34: LGTM!

The changes simplify the SDK initialization code by removing the conditional logic and directly calling the unkey.New function with the server URL and security key as parameters. This enhances readability without affecting the functionality.


46-46: LGTM!

The change in the email address from "[email protected]" to "[email protected]" reflects a shift towards using a generic test email rather than a specific user's email. This is a good practice for integration tests and does not affect the functionality of the test.

apps/api/src/pkg/ratelimit/client.ts (5)

17-17: LGTM!

The addition of the blocked property to the cache entry is a valid change that aligns with the goal of enhancing the caching mechanism to handle blocked requests more effectively.


23-23: LGTM!

Updating the constructor to accept a cache parameter with the modified cache entry type is a necessary change to maintain consistency with the updated cache property in the class.


Line range hint 38-57: LGTM!

Renaming the setCacheMax method to setCache and updating it to accept the blocked parameter directly aligns with the updated cache structure. Setting the cache entry directly with the blocked state simplifies the cache update logic and improves readability. The rest of the method logic remains intact, ensuring effective cache size management.


125-126: LGTM!

Modifying the logic in the limit method to utilize the new cache structure and check the blocked property of the cached entry aligns with the goal of efficiently handling blocked requests. If the blocked property is true, the method returns a response indicating that the request is blocked, without making an unnecessary request to the durable object. This change improves performance and resource utilization.


172-172: LGTM!

Updating the calls to the setCache method with the new cache entry, including the blocked state, based on the results of rate limit checks, ensures that the cache is correctly updated. Setting the blocked state to true when a request exceeds the rate limit and to false otherwise, allows the rate limiter to efficiently handle future requests based on the cached blocked state. These changes align with the overall goal of enhancing the rate limiting functionality.

Also applies to: 183-183, 206-206
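The caching behavior described in these comments can be sketched as follows; field and method names mirror the review, not necessarily the exact source. Each entry stores a blocked flag alongside its reset time, and a still-valid blocked entry lets the limiter answer without contacting the durable object.

```typescript
type CacheEntry = { reset: number; blocked: boolean };

class RatelimitCache {
  private readonly cache = new Map<string, CacheEntry>();

  setCache(id: string, reset: number, blocked: boolean): void {
    this.cache.set(id, { reset, blocked });
  }

  // True only while a cached entry is unexpired and marked blocked.
  isBlocked(id: string, now: number): boolean {
    const entry = this.cache.get(id);
    if (entry === undefined || entry.reset <= now) {
      return false; // no entry, or the rate-limit window has reset
    }
    return entry.blocked;
  }
}
```

A cached block short-circuits the check; once the reset time passes, the next request falls through to the durable object again.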

apps/agent/integration/identities/identities_ratelimits_accuracy_test.go (3)

105-121: Assess potential for data races in concurrent execution

If attack.Attack runs the provided function concurrently, ensure that shared variables are accessed safely. Specifically, verify that sdk and api are safe for concurrent use, or protect them with appropriate synchronization.

Check the documentation or implementation of sdk and api for thread safety.


71-75: Confirm time unit consistency for rate limit durations

The Duration field in operations.UpdateIdentityRatelimits is set using time.Minute.Milliseconds(). Ensure that the backend expects the duration in milliseconds and that this aligns with the API specifications.

Check the API documentation to confirm that Duration should be in milliseconds.


135-145: Verify correctness of rate limit calculations

The calculation of exactLimit, upperLimit, and lowerLimit may need verification to ensure they accurately reflect the intended rate limits over the test duration. Potential integer division and rounding issues could affect the assertions.

Run the following script to check the calculation accuracy:

Verification successful

Rate limit calculations verified as correct

After thorough examination of the source code, I can confirm that the rate limit calculations in the test are accurate. Here's a breakdown of the key points:

  • The inferenceLimit.Limit is set to 100 requests per minute.
  • The test uses variable durations (1 minute and 5 minutes) to ensure the calculations work correctly for different time spans.
  • The exactLimit calculation correctly scales the limit based on the test duration:
    exactLimit := int(inferenceLimit.Limit) * int(tc.testDuration/(time.Duration(inferenceLimit.Duration)*time.Millisecond))

This approach ensures that the rate limit is properly adjusted for the test duration, maintaining the correct requests-per-minute ratio. The use of upper and lower limits (120% and 100% of exactLimit, respectively) provides an appropriate tolerance for test variations.

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the rate limit calculations in the test code.

# Expected: The computed limits match the theoretical limits based on duration and rate.

# Extract inferenceLimit.Limit and inferenceLimit.Duration
LIMIT=$(grep -A3 'inferenceLimit := operations.UpdateIdentityRatelimits' identities_ratelimits_accuracy_test.go | grep 'Limit' | awk '{print $2}' | tr -d ',')
DURATION_MS=$(grep -A3 'inferenceLimit := operations.UpdateIdentityRatelimits' identities_ratelimits_accuracy_test.go | grep 'Duration' | awk '{print $2}' | tr -d ',')
TEST_DURATION=$(grep -A2 'testDuration' identities_ratelimits_accuracy_test.go | grep 'testDuration' | awk '{print $2}' | tr -d ',')

# Calculate expected exactLimit
EXACT_LIMIT=$(echo "$LIMIT * ($TEST_DURATION / ($DURATION_MS / 1000))" | bc)
echo "Calculated exactLimit: $EXACT_LIMIT"

# Verify the calculations in the test code match this value.

Length of output: 856


Script:

#!/bin/bash
# Search for the correct file name
echo "Searching for the test file:"
fd -t f -e go ratelimits_accuracy_test

# List the content of the directory containing the file (if found)
echo -e "\nListing directory content:"
fd -t f -e go ratelimits_accuracy_test -x dirname {} | xargs ls -la

# Display the relevant parts of the file content
echo -e "\nRelevant file content:"
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A5 'inferenceLimit :='
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A2 'testDuration'

# Extract and display the required values
echo -e "\nExtracted values:"
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A3 'inferenceLimit :=' | grep 'Limit' | awk '{print "LIMIT=" $2}' | tr -d ','
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A3 'inferenceLimit :=' | grep 'Duration' | awk '{print "DURATION_MS=" $2}' | tr -d ','
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A2 'testDuration' | grep 'testDuration' | awk '{print "TEST_DURATION=" $2}' | tr -d ','

Length of output: 2426

apps/agent/pkg/testutil/attack.go
func Attack[Response any](t *testing.T, rate Rate, duration time.Duration, fn func() Response) <-chan Response {
	t.Log("attacking")
	wg := sync.WaitGroup{}
	workers := 256
Contributor


Consider making workers configurable

Currently, the number of worker goroutines is hardcoded to 256:

workers := 256

To make the function more flexible and adaptable to different testing environments, consider making workers a parameter or deriving it from the available CPU cores:

workers := runtime.NumCPU()

Don't forget to import the runtime package:

import (
    "fmt"
    "runtime"
    "sync"
    "testing"
    "time"
)

apps/agent/integration/keys/ratelimits_test.go
Comment on lines 114 to 116
if total < lowerLimit {
	lowerLimit = total
}
Contributor

Logical error in lowerLimit adjustment

Adjusting lowerLimit to total when total < lowerLimit may lead to incorrect assertions, potentially masking test failures.

At lines 114-116:

if total < lowerLimit {
	lowerLimit = total
}

This adjustment allows passed to always be greater than or equal to lowerLimit, which can render the assertion at line 120 ineffective. Instead, consider reviewing the logic to ensure that the lowerLimit accurately reflects the expected minimum number of successful requests.

For example, you might want to remove the adjustment or revise it based on the test's intent.
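To see how the clamp can mask a failure, here is a reduced sketch (the function name and numbers are hypothetical, not from the test):

```go
package main

import "fmt"

// passesLowerBound mirrors the flawed pattern: clamping the bound to the
// observed total before comparing, so a run where every request "passed"
// can never fail the check, no matter how few requests were made.
func passesLowerBound(passed, total, lowerLimit int) bool {
	if total < lowerLimit {
		lowerLimit = total // the questionable adjustment
	}
	return passed >= lowerLimit
}

func main() {
	// We expected at least 500 successes, but only 10 requests were sent
	// and all 10 passed: the clamped check still reports success.
	fmt.Println(passesLowerBound(10, 10, 500)) // true — masks the shortfall
	fmt.Println(10 >= 500)                     // false — the unclamped check would fail
}
```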

apps/agent/integration/keys/ratelimits_test.go
Comment on lines 14 to 16
attack "github.com/unkeyed/unkey/apps/agent/pkg/testutil"
"github.com/unkeyed/unkey/apps/agent/pkg/uid"
"github.com/unkeyed/unkey/apps/agent/pkg/util"
Contributor

Update import paths to use module references instead of relative paths

The import paths for internal packages use relative paths, which may cause issues with module resolution. Consider updating them to use module references.

Apply this diff to update the import paths:

-attack "github.com/unkeyed/unkey/apps/agent/pkg/testutil"
-"github.com/unkeyed/unkey/apps/agent/pkg/uid"
-"github.com/unkeyed/unkey/apps/agent/pkg/util"
+attack "unkey/apps/agent/pkg/testutil"
+"unkey/apps/agent/pkg/uid"
+"unkey/apps/agent/pkg/util"

Ensure that the module path unkey is correctly specified in your go.mod file.

Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
attack "github.com/unkeyed/unkey/apps/agent/pkg/testutil"
"github.com/unkeyed/unkey/apps/agent/pkg/uid"
"github.com/unkeyed/unkey/apps/agent/pkg/util"
attack "unkey/apps/agent/pkg/testutil"
"unkey/apps/agent/pkg/uid"
"unkey/apps/agent/pkg/util"

Comment on lines 121 to 150
if (input.recoverEnabled && keyAuth?.storeEncryptedKeys) {
  const vault = new Vault(env().AGENT_URL, env().AGENT_TOKEN);
  const encryptReq: EncryptRequest = {
    keyring: workspace?.id,
    data: key,
  };
  const requestId = crypto.randomUUID();
  const context: RequestContext = { requestId };
  const vaultRes = await vault.encrypt(context, encryptReq).catch((_err) => {
    throw new TRPCError({
      code: "INTERNAL_SERVER_ERROR",
      message: "Encryption Failed. Please contact support using [email protected]",
    });
  });
  await db
    .insert(schema.encryptedKeys)
    .values({
      keyId: keyId,
      workspaceId: workspace?.id,
      encrypted: vaultRes.encrypted,
      encryptionKeyId: vaultRes.keyId,
    })
    .catch((_err) => {
      throw new TRPCError({
        code: "INTERNAL_SERVER_ERROR",
        message:
          "We are unable to store the encrypted key. Please contact support using [email protected]",
      });
    });
}
Collaborator

This doesn't resolve the issue: the key row is still inserted even if the encryption fails.

You need to roll back the key creation when encryption fails. Either wrap both steps in the create transaction, or handle the rollback manually. Otherwise an end user will have extra keys for no reason.

It should be something like:

Happy path:

  1. Create Key
  2. Encrypt key
  3. Return success

Unsuccessful:

  1. Create Key
  2. Encryption or insert fails
  3. Make sure key is removed from the DB
  4. Return error.

Some reading.

https://orm.drizzle.team/docs/transactions#transactions
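The two paths above boil down to a create-then-compensate pattern. A sketch in Go with an in-memory stand-in for the keys table (all names are hypothetical; the PR itself uses Drizzle transactions in TypeScript):

```go
package main

import (
	"errors"
	"fmt"
)

// keyStore is an in-memory stand-in for the keys table.
type keyStore map[string]string

var errEncrypt = errors.New("vault encryption failed")

// createRecoverableKey inserts the key, then encrypts it; if encryption
// fails, it compensates by deleting the just-inserted row so no orphan
// key is left behind for the end user.
func createRecoverableKey(store keyStore, keyID, key string, encrypt func(string) (string, error)) error {
	store[keyID] = key
	if _, err := encrypt(key); err != nil {
		delete(store, keyID) // roll back the insert
		return err
	}
	return nil
}

func main() {
	store := keyStore{}
	failing := func(string) (string, error) { return "", errEncrypt }
	err := createRecoverableKey(store, "key_123", "sk_secret", failing)
	fmt.Println(err != nil, len(store)) // true 0 — error surfaced, no orphan key
}
```

Wrapping both steps in a real database transaction gives the same guarantee atomically, without a manual compensating delete.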

Collaborator

@perkinsjr perkinsjr left a comment


See my note

@harshsbhat
Contributor Author

See my note

Done! 👍

I have also tested it. Now it won't create extra keys even if the encryption or storing inside encryption fails.

Successfully merging this pull request may close these issues.

Allow setting recoverable: true when creating keys in the dashboard