feat: Recoverable keys feature for the dashboard #2110
base: main
Conversation
@harshsbhat is attempting to deploy a commit to the Unkey Team on Vercel. A member of the Team first needs to authorize it.
Walkthrough

The changes introduce a new feature in the API key creation process, allowing users to set a recoverable option when creating keys.

Changes
Assessment against linked issues
Possibly related PRs
Suggested reviewers
Recent review details
Configuration used: CodeRabbit UI
Files ignored due to path filters (1)
Files selected for processing (1)
Files skipped from review as they are similar to previous changes (1)
Thank you for following the naming conventions for pull request titles! 🙏
The latest updates on your projects. Learn more about Vercel for Git ↗︎
@harshsbhat this currently has a frozen lock file, preventing it from building.
Just go ahead and delete the pnpm-lock.yaml, run pnpm i, and then commit the lock file.
See note
Done!
Actionable comments posted: 0
Outside diff range and nitpick comments (2)
internal/vault/package.json (1)

1-15: LGTM! The package.json file is well-structured and properly configured.

The file correctly sets up the essential metadata for the @unkey/vault package, including the name, version, main entry point, and type definitions. The use of TypeScript and the inclusion of necessary dev dependencies are appropriate for the package's development. Consider adding the following enhancements to further improve the package:

- Add a scripts section to define common tasks like building, testing, and linting the package. For example:

```json
"scripts": {
  "build": "tsc",
  "test": "jest",
  "lint": "eslint src/**/*.ts"
}
```

- Include a README.md file in the package's root directory to provide documentation and usage instructions for the package. This will help users understand how to integrate and utilize the @unkey/vault package effectively.

apps/dashboard/lib/trpc/routers/key/create.ts (1)
1-1: Remove unused import.

The Api import from @/app/(app)/settings/root-keys/[keyId]/permissions/api is not used in the changed code. Please remove it to keep the code clean. Apply this diff to remove the unused import:

```diff
-import { Api } from "@/app/(app)/settings/root-keys/[keyId]/permissions/api";
 import { db, schema } from "@/lib/db";
```
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files ignored due to path filters (1)
- pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
Files selected for processing (5)
- apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (5 hunks)
- apps/dashboard/lib/trpc/routers/key/create.ts (3 hunks)
- apps/dashboard/package.json (1 hunks)
- internal/vault/package.json (1 hunks)
- internal/vault/src/index.ts (1 hunks)
Additional comments not posted (13)
internal/vault/src/index.ts (5)
1-27: LGTM! The type definitions for encryption and decryption requests and responses are well-structured and follow a consistent naming convention. The separation of single and bulk operations enhances clarity and usability.

29-31: LGTM! The RequestContext type provides a structured way to include a unique requestId with each request, which can be useful for logging, tracing, and debugging purposes.

33-59: LGTM! The Vault class and its encrypt method are well-implemented. The class encapsulates the encryption functionality and provides a clean interface for other modules to use. The encrypt method follows a standard pattern for making an HTTP request and includes appropriate error handling.
61-81: LGTM! The encryptBulk method of the Vault class is well-implemented and follows a similar pattern to the encrypt method. It handles bulk encryption requests and includes appropriate error handling.
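As a sketch of the calling pattern these reviews describe (a unique requestId per request, passed alongside the payload), the snippet below exercises the shape with an in-memory stand-in. The field names and the fakeEncrypt stub are assumptions for illustration, not the real @unkey/vault API.

```typescript
import { randomUUID } from "node:crypto";

// Shapes loosely matching the review's description of the vault client;
// exact field names here are assumptions for illustration.
type EncryptRequest = { keyring: string; data: string };
type EncryptResponse = { encrypted: string; keyId: string };
type RequestContext = { requestId: string };

// The calling pattern described above: build a context with a unique
// requestId for tracing, then hand request + context to the client.
async function encryptWithContext(
  encrypt: (ctx: RequestContext, req: EncryptRequest) => Promise<EncryptResponse>,
  keyring: string,
  data: string,
): Promise<{ res: EncryptResponse; requestId: string }> {
  const requestId = randomUUID();
  const res = await encrypt({ requestId }, { keyring, data });
  return { res, requestId };
}

// Fake in-memory "vault" used only to exercise the pattern; the real
// client makes an HTTP request to the vault service instead.
const fakeEncrypt = async (
  _ctx: RequestContext,
  req: EncryptRequest,
): Promise<EncryptResponse> => ({
  encrypted: Buffer.from(req.data).toString("base64"),
  keyId: "key_test",
});
```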
83-100: LGTM! The decrypt method of the Vault class is well-implemented and follows a similar pattern to the encrypt and encryptBulk methods. It handles decryption requests and includes appropriate error handling.

apps/dashboard/package.json (1)
56-56: LGTM! The addition of the @unkey/vault dependency aligns with the stated PR objectives and the feature implementation for recoverable keys. It will facilitate the necessary interaction between the dashboard and the vault.

apps/dashboard/lib/trpc/routers/key/create.ts (2)
21-21: LGTM! The addition of the optional recoverEnabled parameter to the createKey procedure is a good way to introduce the new recoverable keys feature without breaking existing functionality.
106-121: Excellent work on the key encryption and storage implementation!

The conditional block for handling key encryption and storage, when recoverEnabled is true and storeEncryptedKeys is enabled in the key auth, is well-structured and follows best practices:
- Using environment variables for the vault configuration keeps sensitive information secure.
- Generating a unique request ID for each encryption operation aids in auditing and debugging.
- Storing the encrypted key and its metadata in a separate encryptedKeys table ensures secure and efficient key recovery when needed.

Great job on this implementation!
apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (5)
81-81: LGTM! The addition of the recoverEnabled property to the form schema is a good way to introduce the key recoverability feature. Setting the default value to false ensures that keys are not recoverable by default, which is a good security practice.

167-167: LGTM! Adding the recoverEnabled property to the form's default values with a value of false is consistent with the addition of this property to the form schema. It ensures that the form's initial state has recoverEnabled set to false.

208-208: LGTM! Including the recoverEnabled property in the object passed to key.mutateAsync when the form is submitted ensures that the value of recoverEnabled reaches the server when creating a new key. This change is consistent with the addition of the recoverEnabled property to the form schema and default values.

297-297: LGTM! Resetting the recoverEnabled property to false when the user clicks the "Create another key" button returns it to its default value when creating a new key. This change is consistent with the addition of the recoverEnabled property to the form schema and default values.

812-859: LGTM! The addition of the key recoverability card component is consistent with the introduction of the key recoverability feature. The toggle switch lets users easily enable or disable key recoverability, the message shown when it is enabled informs users about the security implications, and the link to the documentation helps users who want to learn more.
After I push the pnpm lock file, for some reason it shows 17,000+ lines changed. Is it supposed to look like that?
```ts
};
const requestId = crypto.randomUUID();
const context: RequestContext = { requestId };
const vaultRes = await vault.encrypt(context, encryptReq);
```
What happens if vault encryption fails here? You aren't handling the negative path.
For example, if encryption fails, the user receives no error or success message on the frontend, and the key still gets upserted into the database without encryption:
```json
[
  {
    "error": {
      "json": {
        "message": "Unable to encrypt, fetch error: error code: 1016",
        "code": -32603,
        "data": {
          "code": "INTERNAL_SERVER_ERROR",
          "httpStatus": 500,
          "stack": "Error: Unable to encrypt, fetch error: error code: 1016\n at (vc/edge/function:298:40314)\n at (vc/edge/function:298:43255)\n at (vc/edge/function:295:126266)\n at (vc/edge/function:295:125662)\n at (vc/edge/function:295:125662)\n at (vc/edge/function:295:125662)\n at (vc/edge/function:295:125985)\n at (vc/edge/function:337:12967)\n at (vc/edge/function:337:14421)",
          "path": "key.create"
        }
      }
    }
  }
]
```
Actionable comments posted: 1
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (2)
- apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (6 hunks)
- apps/dashboard/lib/trpc/routers/key/create.ts (3 hunks)
Additional comments not posted (6)
apps/dashboard/lib/trpc/routers/key/create.ts (3)
1-8: LGTM! The new imports are relevant and necessary for the changes introduced in this PR.

21-21: LGTM! The new optional recoverEnabled parameter is a relevant addition to support the recoverable keys feature.
106-143: Address the following concerns to improve error handling and user experience:

The code does not defensively handle the case when recoverEnabled is true but storeEncryptedKeys is false. In such cases, the user may believe they have successfully created a recoverable key, but in reality the key is not recoverable. Consider informing the user that recoverable keys are not supported for their workspace to avoid confusion.

The error handling for encryption failure does not inform the user about the failure on the frontend, and the key still gets inserted into the database without encryption. This can lead to potential security risks and inconsistencies. Improve the error handling by:
- Informing the user on the frontend about the encryption failure.
- Preventing the insertion of the unencrypted key into the database when encryption fails.
apps/dashboard/app/(app)/apis/[apiId]/keys/[keyAuthId]/new/client.tsx (3)
81-81: LGTM! The addition of the recoverEnabled property to the form schema with a default value of false aligns with the PR objective and follows security best practices.

167-167: LGTM! The recoverEnabled property is correctly initialized in the form's default values and included in the form submission payload, ensuring consistency with the form schema and correct propagation of the user's selection.

Also applies to: 302-302
817-864: Excellent work on the "Recoverable" feature card!

The addition of the "Recoverable" card with the toggle switch aligns perfectly with the PR objective and provides a clear way for users to control the recoverability of their keys.
The informational message is a great touch, as it effectively communicates the security implications of recoverable keys and directs users to the relevant documentation for more details. Mentioning that recoverable keys are securely stored in an encrypted vault helps address potential security concerns.
Overall, this implementation is well-thought-out and enhances the user experience while prioritizing security.
- fix: default ratelimits
- revert
- fix: cache ratelimits on cloudflare correctly
- chore: remove logs
- chore: remove log
- perf: remove unnecessary switch
- fix: track isolate start time
- test: tighten lower ratelimit threshold
- fix: only cache ratelimit blocks
- chore: sync lockfile
- test: improve accuracy of lower limit calculation in rate limit tests
- fix: address rabbit suggestions
Actionable comments posted: 12
Outside diff range and nitpick comments (3)
apps/agent/pkg/testutil/attack.go (1)
44-52: Potential blocking on responses channel

In the worker goroutines, if the responses channel is unbuffered and the caller does not consume the responses promptly, the workers might block when sending to responses <- fn():

```go
for range ticks {
	responses <- fn()
	wg.Done()
}
```

Ensure that the caller processes the responses to prevent blocking. Alternatively, consider buffering the responses channel to accommodate bursts:

```diff
- responses := make(chan Response)
+ responses := make(chan Response, bufferSize)
```

apps/agent/integration/identities/identities_ratelimits_accuracy_test.go (2)
35-35: Remove commented-out code to keep the codebase clean

In line 35, there are commented-out values in the nKeys slice. It's best practice to remove or uncomment this code to enhance readability and maintainability. Apply this diff to remove the commented code:

```diff
-for _, nKeys := range []int{1} { //, 3, 10, 1000} {
+for _, nKeys := range []int{1} {
```
57-79: Ensure consistent use of acronym casing in variable names

The variable externalId is used in several places. In Go, it's conventional to use ID in all caps for acronyms. Consider renaming externalId to externalID for consistency and readability. Apply this diff to update the variable name:

```diff
-externalId := uid.New("testuser")
+externalID := uid.New("testuser")
 ...
 _, err = sdk.Identities.CreateIdentity(ctx, operations.CreateIdentityRequestBody{
-	ExternalID: externalId,
+	ExternalID: externalID,
 	Meta: map[string]any{
 		"email": "[email protected]",
 	},
 })
 ...
 _, err = sdk.Identities.UpdateIdentity(ctx, operations.UpdateIdentityRequestBody{
-	ExternalID: unkey.String(externalId),
+	ExternalID: unkey.String(externalID),
 	Ratelimits: []operations.UpdateIdentityRatelimits{inferenceLimit},
 })
```
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (12)
- apps/agent/integration/identities/identities_ratelimits_accuracy_test.go (1 hunks)
- apps/agent/integration/identities/token_ratelimits_test.go (2 hunks)
- apps/agent/integration/keys/ratelimits_test.go (1 hunks)
- apps/agent/pkg/circuitbreaker/lib.go (1 hunks)
- apps/agent/pkg/clickhouse/flush.go (0 hunks)
- apps/agent/pkg/testutil/attack.go (1 hunks)
- apps/agent/services/eventrouter/service.go (0 hunks)
- apps/api/src/pkg/middleware/init.ts (2 hunks)
- apps/api/src/pkg/middleware/metrics.ts (1 hunks)
- apps/api/src/pkg/ratelimit/client.ts (7 hunks)
- apps/api/src/routes/v1_keys_verifyKey.ratelimit_accuracy.test.ts (1 hunks)
- apps/api/src/routes/v1_ratelimit_limit.accuracy.test.ts (1 hunks)
Files not reviewed due to no reviewable changes (2)
- apps/agent/pkg/clickhouse/flush.go
- apps/agent/services/eventrouter/service.go
Files skipped from review due to trivial changes (2)
- apps/agent/pkg/circuitbreaker/lib.go
- apps/api/src/pkg/middleware/metrics.ts
Additional comments not posted (14)
apps/api/src/routes/v1_keys_verifyKey.ratelimit_accuracy.test.ts (1)
97-97: Verify the reasoning behind the change in the lowerLimit calculation.

The change in the lowerLimit calculation from exactLimit * 0.75 to exactLimit * 0.95 raises the threshold for the minimum number of passed requests that are considered acceptable during the test. This makes the test stricter: it reduces the allowed margin of error and improves the test's ability to catch rate limiting issues, but it may also cause more failures if the rate limiting accuracy is not consistently high. Please provide the reasoning behind this change and confirm that it aligns with the intended behavior of the rate limiting functionality being tested.
apps/api/src/routes/v1_ratelimit_limit.accuracy.test.ts (1)
100-100: Tightened acceptance criteria for the rate limit test.

The change to the lowerLimit calculation, increasing the multiplier from 0.75 to 0.95, makes the test more stringent by requiring a higher percentage of requests to pass. This improves the test's ability to catch issues where the rate limiting implementation is too permissive. However, be aware that this change may also lead to more test failures if the rate limiting accuracy is not consistently high enough to allow 95% of the expected requests. Monitor the test results closely to ensure that the implementation meets this new, stricter threshold.
apps/api/src/pkg/middleware/init.ts (2)
29-29: Introduce isolateCreatedAt variable to track isolate creation time.

Declaring the isolateCreatedAt variable at the module level with an initial value of undefined is a good approach. It allows for:
- Persisting the value across multiple invocations of the init function.
- Checking whether the variable has been assigned a value later in the code.

This sets the foundation for consistently tracking the isolate creation time.
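The module-level, assign-once pattern discussed here can be sketched as follows; the injectable `now` parameter is an addition for testability, not part of the original middleware:

```typescript
// Module-level state: persists across invocations within one isolate.
let isolateCreatedAt: number | undefined = undefined;

// Captures the timestamp on the first call and reuses it afterwards,
// mimicking one value per isolate lifetime.
function initRequest(now: () => number = Date.now): number {
  if (isolateCreatedAt === undefined) {
    isolateCreatedAt = now(); // only the first invocation sets it
  }
  return isolateCreatedAt;
}
```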
40-44: Assign isolateCreatedAt only once and set it in the context.

The conditional assignment of isolateCreatedAt ensures that it is set only once, when it is undefined. This has several benefits:
- It accurately captures the initial creation time of the isolate.
- It prevents the timestamp from being reset on subsequent invocations of init.
- It maintains a consistent value for isolateCreatedAt throughout the isolate's lifecycle.

Setting isolateCreatedAt in the context object c provides a centralized way to access this value across the codebase, promoting consistency and reusability.

apps/agent/integration/identities/token_ratelimits_test.go (2)
31-34: LGTM! The changes simplify the SDK initialization code by removing the conditional logic and directly calling the unkey.New function with the server URL and security key as parameters. This enhances readability without affecting the functionality.
46-46: LGTM! The change in the email address from "[email protected]" to "[email protected]" reflects a shift towards using a generic test email rather than a specific user's email. This is a good practice for integration tests and does not affect the functionality of the test.
apps/api/src/pkg/ratelimit/client.ts (5)
17-17: LGTM! The addition of the blocked property to the cache entry is a valid change that aligns with the goal of enhancing the caching mechanism to handle blocked requests more effectively.

23-23: LGTM! Updating the constructor to accept a cache parameter with the modified cache entry type is a necessary change to maintain consistency with the updated cache property in the class.

Line range hint 38-57: LGTM! Renaming the setCacheMax method to setCache and updating it to accept the blocked parameter directly aligns with the updated cache structure. Setting the cache entry directly with the blocked state simplifies the cache update logic and improves readability. The rest of the method logic remains intact, ensuring effective cache size management.

125-126: LGTM! Modifying the logic in the limit method to utilize the new cache structure and check the blocked property of the cached entry aligns with the goal of efficiently handling blocked requests. If the blocked property is true, the method returns a response indicating that the request is blocked, without making an unnecessary request to the durable object. This change improves performance and resource utilization.

172-172: LGTM! Updating the calls to the setCache method with the new cache entry, including the blocked state, based on the results of rate limit checks ensures that the cache is correctly updated. Setting the blocked state to true when a request exceeds the rate limit and to false otherwise allows the rate limiter to efficiently handle future requests based on the cached blocked state. These changes align with the overall goal of enhancing the rate limiting functionality.

Also applies to: 183-183, 206-206
apps/agent/integration/identities/identities_ratelimits_accuracy_test.go (3)
105-121: Assess potential for data races in concurrent execution

If attack.Attack runs the provided function concurrently, ensure that shared variables are accessed safely. Specifically, verify that sdk and api are safe for concurrent use, or protect them with appropriate synchronization. Check the documentation or implementation of sdk and api for thread safety.
71-75: Confirm time unit consistency for rate limit durations

The Duration field in operations.UpdateIdentityRatelimits is set using time.Minute.Milliseconds(). Ensure that the backend expects the duration in milliseconds and that this aligns with the API specifications. Check the API documentation to confirm that Duration should be in milliseconds.
135-145: Verify correctness of rate limit calculations

The calculation of exactLimit, upperLimit, and lowerLimit may need verification to ensure they accurately reflect the intended rate limits over the test duration. Potential integer division and rounding issues could affect the assertions. Run the following script to check the calculation accuracy:

Verification successful

Rate limit calculations verified as correct

After thorough examination of the source code, I can confirm that the rate limit calculations in the test are accurate. Here's a breakdown of the key points:
- The inferenceLimit.Limit is set to 100 requests per minute.
- The test uses variable durations (1 minute and 5 minutes) to ensure the calculations work correctly for different time spans.
- The exactLimit calculation correctly scales the limit based on the test duration:

```go
exactLimit := int(inferenceLimit.Limit) * int(tc.testDuration/(time.Duration(inferenceLimit.Duration)*time.Millisecond))
```

This approach ensures that the rate limit is properly adjusted for the test duration, maintaining the correct requests-per-minute ratio. The use of upper and lower limits (120% and 100% of exactLimit, respectively) provides an appropriate tolerance for test variations.
Scripts executed

The following scripts were executed for the analysis:

Script:

```bash
#!/bin/bash
# Description: Verify the rate limit calculations in the test code.
# Expected: The computed limits match the theoretical limits based on duration and rate.

# Extract inferenceLimit.Limit and inferenceLimit.Duration
LIMIT=$(grep -A3 'inferenceLimit := operations.UpdateIdentityRatelimits' identities_ratelimits_accuracy_test.go | grep 'Limit' | awk '{print $2}' | tr -d ',')
DURATION_MS=$(grep -A3 'inferenceLimit := operations.UpdateIdentityRatelimits' identities_ratelimits_accuracy_test.go | grep 'Duration' | awk '{print $2}' | tr -d ',')
TEST_DURATION=$(grep -A2 'testDuration' identities_ratelimits_accuracy_test.go | grep 'testDuration' | awk '{print $2}' | tr -d ',')

# Calculate expected exactLimit
EXACT_LIMIT=$(echo "$LIMIT * ($TEST_DURATION / ($DURATION_MS / 1000))" | bc)
echo "Calculated exactLimit: $EXACT_LIMIT"
# Verify the calculations in the test code match this value.
```

Length of output: 856
Script:

```bash
#!/bin/bash
# Search for the correct file name
echo "Searching for the test file:"
fd -t f -e go ratelimits_accuracy_test

# List the content of the directory containing the file (if found)
echo -e "\nListing directory content:"
fd -t f -e go ratelimits_accuracy_test -x dirname {} | xargs ls -la

# Display the relevant parts of the file content
echo -e "\nRelevant file content:"
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A5 'inferenceLimit :='
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A2 'testDuration'

# Extract and display the required values
echo -e "\nExtracted values:"
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A3 'inferenceLimit :=' | grep 'Limit' | awk '{print "LIMIT=" $2}' | tr -d ','
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A3 'inferenceLimit :=' | grep 'Duration' | awk '{print "DURATION_MS=" $2}' | tr -d ','
fd -t f -e go ratelimits_accuracy_test -x cat {} | grep -A2 'testDuration' | grep 'testDuration' | awk '{print "TEST_DURATION=" $2}' | tr -d ','
```

Length of output: 2426
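The scaling verified above multiplies the per-window limit by how many windows fit in the test duration. In TypeScript for illustration (the Go test computes the same ratio with millisecond durations):

```typescript
// A per-window limit scaled to a longer test run: e.g. 100 requests per
// 60s window over a 300s test allows 500 requests in total.
function exactLimit(limitPerWindow: number, windowMs: number, testDurationMs: number): number {
  return limitPerWindow * Math.floor(testDurationMs / windowMs);
}
```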
apps/agent/pkg/testutil/attack.go

```go
func Attack[Response any](t *testing.T, rate Rate, duration time.Duration, fn func() Response) <-chan Response {
	t.Log("attacking")
	wg := sync.WaitGroup{}
	workers := 256
```
Consider making workers configurable

Currently, the number of worker goroutines is hardcoded to 256:

```go
workers := 256
```

To make the function more flexible and adaptable to different testing environments, consider making workers a parameter or deriving it from the available CPU cores:

```go
workers := runtime.NumCPU()
```

Don't forget to import the runtime package:

```go
import (
	"fmt"
	"runtime"
	"sync"
	"testing"
	"time"
)
```
```go
if total < lowerLimit {
	lowerLimit = total
}
```
Logical error in lowerLimit adjustment

Adjusting lowerLimit to total when total < lowerLimit may lead to incorrect assertions, potentially masking test failures. At lines 114-116:

```go
if total < lowerLimit {
	lowerLimit = total
}
```

This adjustment allows passed to always be greater than or equal to lowerLimit, which can render the assertion at line 120 ineffective. Instead, consider reviewing the logic to ensure that the lowerLimit accurately reflects the expected minimum number of successful requests. For example, you might want to remove the adjustment or revise it based on the test's intent.
```go
attack "github.com/unkeyed/unkey/apps/agent/pkg/testutil"
"github.com/unkeyed/unkey/apps/agent/pkg/uid"
"github.com/unkeyed/unkey/apps/agent/pkg/util"
```
Update import paths to use module references instead of relative paths

The import paths for internal packages use relative paths, which may cause issues with module resolution. Consider updating them to use module references. Apply this diff to update the import paths:

```diff
-attack "github.com/unkeyed/unkey/apps/agent/pkg/testutil"
-"github.com/unkeyed/unkey/apps/agent/pkg/uid"
-"github.com/unkeyed/unkey/apps/agent/pkg/util"
+attack "unkey/apps/agent/pkg/testutil"
+"unkey/apps/agent/pkg/uid"
+"unkey/apps/agent/pkg/util"
```

Ensure that the module path unkey is correctly specified in your go.mod file.
```ts
if (input.recoverEnabled && keyAuth?.storeEncryptedKeys) {
  const vault = new Vault(env().AGENT_URL, env().AGENT_TOKEN);
  const encryptReq: EncryptRequest = {
    keyring: workspace?.id,
    data: key,
  };
  const requestId = crypto.randomUUID();
  const context: RequestContext = { requestId };
  const vaultRes = await vault.encrypt(context, encryptReq).catch((_err) => {
    throw new TRPCError({
      code: "INTERNAL_SERVER_ERROR",
      message: "Encryption Failed. Please contact support using [email protected]",
    });
  });
  await db
    .insert(schema.encryptedKeys)
    .values({
      keyId: keyId,
      workspaceId: workspace?.id,
      encrypted: vaultRes.encrypted,
      encryptionKeyId: vaultRes.keyId,
    })
    .catch((_err) => {
      throw new TRPCError({
        code: "INTERNAL_SERVER_ERROR",
        message:
          "We are unable to store encrypt the key. Please contact support using [email protected]",
      });
    });
}
```
This doesn't resolve the issue of key inserts even if the encryption fails.
You need to handle rollback of the key creation if the encryption fails. You either need to wrap it in the create transaction, or you need to handle the rollback manually. Otherwise an end user will have extra keys for no reason.
It should be something like:

Happy path:
- Create key
- Encrypt key
- Return success

Unsuccessful:
- Create key
- Encryption or insert fails
- Make sure key is removed from the DB
- Return error

Some reading.
See my note
Done! 👍
I have also tested it. Now it won't create extra keys even if the encryption or the storage of the encrypted key fails.
What does this PR do?
Fixes #2097
This PR introduces the feature to create recoverable keys from the dashboard itself. It adds a "Recoverable" toggle to the key creation flow: when enabled, the created key is encrypted and the encrypted data is stored. This also introduces an internal package named @unkey/vault, which allows the dashboard to interact with the vault.

https://www.loom.com/share/27f3a1b58e5a4d478b1510a80f679558?sid=e8faf269-7f80-4b8c-9229-7b55c547ebf2
Type of change
Checklist
Required
- pnpm build
- pnpm fmt
- console.logs
- git pull origin main
Appreciated
Summary by CodeRabbit
New Features
- Introduced a recoverEnabled option in the key creation form, with informative descriptions about security implications.

Bug Fixes
- Improved error handling around the recoverEnabled feature.