fixing runtime error when upserting preferences #4069
Conversation
📝 Walkthrough

The changes in this pull request primarily focus on the preference upsert logic in the auth-service, touching `src/auth-service/models/Preference.js` and `src/auth-service/utils/create-preference.js`.
Possibly related PRs
Suggested reviewers
Actionable comments posted: 4
🧹 Outside diff range and nitpick comments (3)
src/auth-service/utils/create-preference.js (3)
Line range hint 214-228: Correct usage of `filter` instead of `filterResponse`.

The variable `filterResponse` is not defined in this context. You likely intended to use `filter`, which is defined earlier. Apply this diff to rectify the variable name:

```diff
   const filter = generateFilter.preferences(request, next);
-  if (isEmpty(filterResponse) || isEmpty(filterResponse.user_id)) {
+  if (isEmpty(filter) || isEmpty(filter.user_id)) {
     next(
       new HttpError(
         "Internal Server Error",
         httpStatus.INTERNAL_SERVER_ERROR,
         {
           message:
             "Unable to obtain the corresponding identifier associated with this preference --- please reach out to [email protected]",
         }
       )
     );
   }
```
228-228: Ensure `prepareUpdate` is error-free or handle exceptions.

While `prepareUpdate` is a custom function, it's prudent to handle any potential errors it might throw to prevent unexpected runtime issues. Consider wrapping the call in a try-catch block or ensuring that `prepareUpdate` handles exceptions internally.
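A minimal sketch of that guard, assuming `prepareUpdate(body)` is the call shape used in this utility and that `next`, `HttpError`, and `httpStatus` are already in scope; the argument and error message are illustrative:

```js
let update;
try {
  // Hypothetical call shape; the real prepareUpdate signature may differ.
  update = prepareUpdate(body);
} catch (error) {
  return next(
    new HttpError("Internal Server Error", httpStatus.INTERNAL_SERVER_ERROR, {
      message: `Failed to prepare the preference update: ${error.message}`,
    })
  );
}
```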
336-362: Refactor duplicate code in the `replace` and `update` methods.

The `replace` method contains code that's nearly identical to the `update` method, specifically in validation and field definitions. Refactoring this common logic into shared functions will improve maintainability and reduce redundancy.
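One possible shape for the shared logic, sketched under the assumption that both methods repeat the filter check shown earlier in this review; the helper name is hypothetical:

```js
// Hypothetical shared helper for the duplicated filter/validation logic.
const getValidatedPreferenceFilter = (request, next) => {
  const filter = generateFilter.preferences(request, next);
  if (isEmpty(filter) || isEmpty(filter.user_id)) {
    next(
      new HttpError("Internal Server Error", httpStatus.INTERNAL_SERVER_ERROR, {
        message:
          "Unable to obtain the corresponding identifier associated with this preference",
      })
    );
    return null;
  }
  return filter;
};

// Both update and replace could then start with:
// const filter = getValidatedPreferenceFilter(request, next);
// if (!filter) return;
```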
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- src/auth-service/models/Preference.js (3 hunks)
- src/auth-service/utils/create-preference.js (5 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
src/auth-service/utils/create-preference.js
[error] 87-87: `update[field]` is assigned to itself. (lint/correctness/noSelfAssign)
🔇 Additional comments (1)
src/auth-service/models/Preference.js (1)
Line range hint 237-243: Validate empty strings in the `processObjectId` function.

Adding a check for empty strings in the `processObjectId` function enhances robustness by preventing invalid ObjectIds. This ensures that fields with empty strings are set to `null`, avoiding potential errors during database operations.
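A minimal sketch of the check being described, assuming `mongoose` is imported in the model file and `processObjectId` receives either an ObjectId, a string, or an empty value:

```js
// Hypothetical shape of processObjectId with the empty-string guard.
const processObjectId = (id) => {
  if (!id) return null;
  if (id instanceof mongoose.Types.ObjectId) return id;
  // Treat empty or whitespace-only strings as "no value" instead of an invalid ObjectId.
  if (typeof id === "string" && id.trim() === "") return null;
  return new mongoose.Types.ObjectId(id);
};
```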
```js
      // Validate user and group
      const validationError = await validateUserAndGroup(
        tenant,
        body.user_id,
        body.group_id,
        next
      );
      if (validationError) return;
```
🛠️ Refactor suggestion
Simplify error handling after the `validateUserAndGroup` call.

The `validateUserAndGroup` function handles errors internally and does not return a value. Therefore, checking `validationError` is unnecessary since it will always be `undefined`.

Consider removing the unnecessary variable and condition:
```diff
-      const validationError = await validateUserAndGroup(
+      await validateUserAndGroup(
         tenant,
         body.user_id,
         body.group_id,
         next
       );
-      if (validationError) return;
```
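For context, a sketch of the error-reporting pattern this suggestion assumes: a validator that surfaces failures through `next` and resolves without returning anything. The body of `validateUserAndGroup` is not part of this diff, so the lookup below is purely illustrative:

```js
// Illustrative only: errors are reported via next(), nothing is returned.
const validateUserAndGroup = async (tenant, userId, groupId, next) => {
  const user = await UserModel(tenant).findById(userId).lean();
  if (!user) {
    next(
      new HttpError("Bad Request Error", httpStatus.BAD_REQUEST, {
        message: `The provided user_id ${userId} does not exist`,
      })
    );
  }
  // ...a similar existence check would follow for groupId.
};
```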
```js
      const singleObjectIdFields = ["user_id", "group_id"];
      singleObjectIdFields.forEach((field) => {
        if (update[field]) {
          // Keep the original value for simple ObjectId fields
          update[field] = update[field];
        }
      });
```
Remove the unnecessary self-assignment of `update[field]`.

Assigning `update[field]` to itself on line 87 is redundant and has no effect. Removing this line will clean up the code without altering functionality.

Apply this diff to eliminate the redundant assignment:
```diff
        if (update[field]) {
-         // Keep the original value for simple ObjectId fields
-         update[field] = update[field];
        }
```
🧰 Tools
🪛 Biome (1.9.4)
[error] 87-87: `update[field]` is assigned to itself. (lint/correctness/noSelfAssign)
```js
      selectedArrayFields.forEach((field) => {
        if (updateData[field]) {
          updateData[field] = updateData[field].map(
            selectedArrayProcessors[field]
          );

          // Use $addToSet for selected arrays
          updateData.$addToSet = updateData.$addToSet || {};
          updateData.$addToSet[field] = {
            $each: updateData[field],
          };
          delete updateData[field];
```
Prevent overwriting of `$addToSet` in the selected arrays processing.

When processing `selectedArrayFields`, directly assigning to `updateData.$addToSet` may overwrite existing entries. To avoid this, merge new fields into `$addToSet` without replacing existing ones.

Modify the code to merge fields:
```diff
          // Use $addToSet for selected arrays
-         updateData.$addToSet = updateData.$addToSet || {};
-         updateData.$addToSet[field] = {
+         updateData.$addToSet = {
+           ...updateData.$addToSet,
+           [field]: {
              $each: updateData[field],
            },
          };
          delete updateData[field];
```
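To make the effect concrete, a small standalone sketch (plain Node, no MongoDB connection; the field names are illustrative) showing that the spread keeps fields added in earlier iterations of the loop:

```js
// Standalone illustration of merging into $addToSet rather than reassigning per field.
const updateData = {
  selected_sites: ["site_1"],
  selected_grids: ["grid_1"],
};

["selected_sites", "selected_grids"].forEach((field) => {
  updateData.$addToSet = {
    ...updateData.$addToSet,
    [field]: { $each: updateData[field] },
  };
  delete updateData[field];
});

console.log(updateData.$addToSet);
// { selected_sites: { $each: ['site_1'] }, selected_grids: { $each: ['grid_1'] } }
```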
```diff
      // Process array ID fields
      arrayIdFields.forEach((field) => {
        if (updateData[field]) {
          updateData.$addToSet = updateData.$addToSet || {};
          updateData.$addToSet[field] = {
-           $each: updateData[field],
+           $each: Array.isArray(updateData[field])
+             ? updateData[field].map(processObjectId)
+             : [processObjectId(updateData[field])],
```
Ensure `$addToSet` merges existing array ID fields.

Similar to the previous issue, when adding array ID fields to `$addToSet`, ensure you merge with existing fields to prevent overwriting.

Update the code as follows:
```diff
        if (updateData[field]) {
-         updateData.$addToSet = updateData.$addToSet || {};
-         updateData.$addToSet[field] = {
+         updateData.$addToSet = {
+           ...updateData.$addToSet,
+           [field]: {
              $each: Array.isArray(updateData[field])
                ? updateData[field].map(processObjectId)
                : [processObjectId(updateData[field])],
            },
          };
          delete updateData[field];
        }
```
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```
@@            Coverage Diff            @@
##           staging    #4069    +/-   ##
=========================================
  Coverage    11.81%    11.81%
=========================================
  Files          115       115
  Lines        15482     15482
  Branches       319       319
=========================================
  Hits          1829      1829
  Misses       13653     13653
```
Auth-service changes in this PR available for preview here
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (2)
src/auth-service/models/Preference.js (2)
Line range hint 237-244: Enhance ObjectId validation robustness.

Consider improving the ObjectId validation with these enhancements:

```diff
 if (!id) return null;
 if (id instanceof mongoose.Types.ObjectId) return id;
-if (typeof id === "string" && id.trim() === "") return null;
+if (typeof id === "string") {
+  const trimmed = id.trim();
+  if (trimmed === "") return null;
+  if (!ObjectId.isValid(trimmed)) {
+    throw new Error(`Invalid ObjectId format: ${id}`);
+  }
+  return mongoose.Types.ObjectId(trimmed);
+}
-try {
-  return mongoose.Types.ObjectId(id);
-} catch (error) {
-  logger.error(`Invalid ObjectId: ${id}`);
-  throw new Error(`Invalid ObjectId: ${id}`);
-}
+throw new Error(`Invalid ObjectId type: expected string or ObjectId, got ${typeof id}`);
```
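As a quick behavioural sketch of the stricter branch above (assuming mongoose is installed; note that `ObjectId.isValid` also accepts any 12-byte string, not only 24-character hex ids):

```js
const mongoose = require("mongoose");

// A 24-character hex string is valid: the new branch would convert and return it.
console.log(mongoose.Types.ObjectId.isValid("6123456789abcdef01234567")); // true

// A clearly malformed value would now throw `Invalid ObjectId format: ...`.
console.log(mongoose.Types.ObjectId.isValid("hello")); // false
```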
420-432: Consider optimizing array field processing.

The current implementation processes each array field separately. Consider combining similar operations to reduce code duplication and improve maintainability.

```diff
-arrayIdFields.forEach((field) => {
-  if (updateData[field]) {
-    updateData.$addToSet = {
-      ...updateData.$addToSet,
-      [field]: {
-        $each: Array.isArray(updateData[field])
-          ? updateData[field].map(processObjectId)
-          : [processObjectId(updateData[field])],
-      },
-    };
-    delete updateData[field];
-  }
-});
+const processArrayFields = (fields) => {
+  const updates = fields.reduce((acc, field) => {
+    if (updateData[field]) {
+      acc[field] = {
+        $each: Array.isArray(updateData[field])
+          ? updateData[field].map(processObjectId)
+          : [processObjectId(updateData[field])]
+      };
+      delete updateData[field];
+    }
+    return acc;
+  }, {});
+
+  if (Object.keys(updates).length > 0) {
+    updateData.$addToSet = {
+      ...updateData.$addToSet,
+      ...updates
+    };
+  }
+};
+
+processArrayFields(arrayIdFields);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- src/auth-service/models/Preference.js (3 hunks)
- src/auth-service/utils/create-preference.js (6 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- src/auth-service/utils/create-preference.js
🔇 Additional comments (3)
src/auth-service/models/Preference.js (3)
246-270: Well-structured field categorization!

The separation of fields into `singleIdFields`, `arrayIdFields`, and `selectedArrayFields` improves code organization and maintainability.
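For readers without the file open, a hedged sketch of what such a categorization can look like; only `user_id` and `group_id` appear elsewhere in this review, so the remaining field names are hypothetical:

```js
// Hypothetical groupings; the real lists live in src/auth-service/models/Preference.js.
const singleIdFields = ["user_id", "group_id"]; // plain ObjectId references
const arrayIdFields = ["site_ids", "device_ids"]; // arrays of ObjectIds, merged via $addToSet
const selectedArrayFields = ["selected_sites", "selected_grids"]; // subdocument arrays with per-field processors
```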
409-416: Fixed: proper merging of `$addToSet` operations.

The implementation now correctly merges `$addToSet` operations without overwriting existing entries.

Line range hint 1-24: Verify impact on existing preferences.

The changes to preference processing logic could affect existing preferences. Let's verify the current usage patterns:
✅ Verification successful
Let me analyze the actual usage patterns and validation logic to better understand the impact of these changes on existing preferences.
No impact on existing preferences detected
Based on the comprehensive analysis of the codebase:
- The changes only add mongoose schema definition and validation
- Existing preference update operations use `findOneAndUpdate` with proper validation
- The `periodSchema` is consistently used across preference-related models
- The schema changes maintain backward compatibility with the existing data structure
The modifications enhance data validation without breaking existing preference processing patterns.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Check for existing preferences that might be affected by the new validation
echo "Checking for potential validation issues in existing code..."

# Search for direct ObjectId assignments
rg -l "new ObjectId\(|Types\.ObjectId\(" --type js

# Search for preference update patterns
ast-grep --pattern 'findOneAndUpdate($$$, { $$$: $_ }, $$$)'

# Look for any custom validation logic
rg "validate.*preference" --type js
```
Length of output: 1501
Script:

```shell
#!/bin/bash
# Let's check for preference-related operations more comprehensively

# Check for preference schema usage and validation
rg "PreferenceSchema|periodSchema" -B 2 -A 2 --type js

# Look for preference update operations
rg "updatePreference|update.*preference" -i --type js

# Check for preference-related API endpoints
rg "router\.(post|put|patch).*preference" -i --type js

# Search for preference-related tests
fd "test" -e js -x rg "preference" -i {} \;
```
Length of output: 6616
Auth-service changes in this PR available for preview here
Description
Fixing a runtime error when upserting preferences.
Changes Made
Testing
Affected Services
Endpoints Ready for Testing
API Documentation Updated?
Additional Notes
Fixing a runtime error when upserting preferences.
Summary by CodeRabbit
New Features
Bug Fixes
Documentation