Just ensuring uniqueness in the prefs #4072
Conversation
📝 Walkthrough
The changes in this pull request ensure uniqueness in preference data, modifying the duplicate-handling logic in `src/auth-service/models/Preference.js` and `src/auth-service/utils/create-preference.js`.
Auth-service changes in this PR are available for preview here.
Actionable comments posted: 3
🧹 Outside diff range and nitpick comments (3)
src/auth-service/models/Preference.js (2)
258-265: Consider Simplifying Duplicate Removal Logic
The current method for removing duplicates based on `_id` works but can be simplified for better readability and efficiency. Using a `Map` to track unique entries may enhance performance. Here's an alternative approach:
```javascript
const uniqueArray = [
  ...new Map(
    updateData[field].map((item) => [item._id.toString(), item])
  ).values(),
];
```
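For a quick sanity check of that approach, here is a small self-contained sketch (the `_id` strings and site names are hypothetical, chosen only for illustration):

```javascript
// Sample input where two entries share an _id; the Map keys dedupe them,
// keeping the most recently inserted value for each key.
const sample = [
  { _id: "a1", name: "site-a" },
  { _id: "b2", name: "site-b" },
  { _id: "a1", name: "site-a-updated" },
];

const uniqueArray = [
  ...new Map(sample.map((item) => [item._id.toString(), item])).values(),
];

console.log(uniqueArray.map((s) => s.name)); // ["site-a-updated", "site-b"]
```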
290-301: Simplify Unique ID Processing for Array ID Fields
The logic for ensuring unique ObjectIds in `arrayIdFields` can be streamlined. Consider using a `Set` directly on the processed ObjectIds. (Note that a JavaScript `Set` dedupes objects by reference, so this assumes `processObjectId` yields comparable primitives such as id strings.) Here's a simplified version:

```javascript
const uniqueIds = Array.from(
  new Set(
    (Array.isArray(updateData[field])
      ? updateData[field]
      : [updateData[field]]
    )
      .map(processObjectId)
      .filter(Boolean)
  )
);
```

src/auth-service/utils/create-preference.js (1)
73-74: Update Comment to Reflect Code Changes
The comment mentions handling fields that should be "added to set," but the code now uses `$set` instead of `$addToSet`. To avoid confusion, update the comment to accurately reflect the current implementation. Here's the revised comment:

```diff
- // Handle fields that should be added to set (array fields)
+ // Handle fields by setting array values
```
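As background on why the wording matters, here is a hedged sketch of the behavioral difference between the two operators. The schema and model name below are illustrative only, not the project's actual code (the real schema lives in `src/auth-service/models/Preference.js`):

```javascript
const mongoose = require("mongoose");

// Illustrative stand-in for the Preference model.
const Preference = mongoose.model(
  "PreferenceDemo",
  new mongoose.Schema({
    user_id: mongoose.Schema.Types.ObjectId,
    selected_sites: [{ name: String }],
  })
);

// (Assumes an established mongoose connection when actually run.)
async function demo(userId, sites) {
  // $addToSet appends only elements not already present in the array:
  await Preference.updateOne(
    { user_id: userId },
    { $addToSet: { selected_sites: { $each: sites } } }
  );

  // $set replaces the stored array wholesale with the supplied value,
  // so uniqueness must be enforced in application code before updating:
  await Preference.updateOne(
    { user_id: userId },
    { $set: { selected_sites: sites } }
  );
}
```

Since the PR now standardizes on `$set` with pre-computed unique arrays, the comment fix keeps the prose aligned with the `$set` path.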
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
src/auth-service/models/Preference.js (2 hunks)
src/auth-service/utils/create-preference.js (1 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
src/auth-service/utils/create-preference.js
[error] 116-116: field is assigned to itself.
This is where `field` is assigned.
(lint/correctness/noSelfAssign)
```javascript
const uniqueArray =
  field === "selected_sites"
    ? removeDuplicates(processedArray)
    : processedArray;

update["$set"] = update["$set"] || {};
update["$set"][field] = uniqueArray;
```
🛠️ Refactor suggestion
Ensure Consistent Duplicate Removal Across Fields
Similarly, in the processing of `fieldsToUpdate`, duplicates are removed only for `"selected_sites"`. To prevent potential duplicate entries and maintain consistency, consider applying the `removeDuplicates` function to all relevant fields.
Modify the code as follows:

```diff
- const uniqueArray =
-   field === "selected_sites"
-     ? removeDuplicates(processedArray)
-     : processedArray;
+ const uniqueArray = removeDuplicates(processedArray);
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```javascript
const uniqueArray = removeDuplicates(processedArray);
update["$set"] = update["$set"] || {};
update["$set"][field] = uniqueArray;
```
```javascript
const processedArray = Array.isArray(update[field])
  ? update[field]
  : [update[field]];

// Remove duplicates for specific fields
const uniqueArray =
  field === "selected_sites"
    ? removeDuplicates(processedArray)
    : processedArray;

update["$set"] = update["$set"] || {};
update["$set"][field] = uniqueArray;
```
🛠️ Refactor suggestion
Inconsistent Use of Duplicate Removal
In the `prepareUpdate` function, the `removeDuplicates` function is applied only to `"selected_sites"` in `fieldsToAddToSet`. If other fields may contain duplicates, consider applying the duplicate removal to all relevant fields for consistency.
Update the code as follows:
```diff
- const uniqueArray =
-   field === "selected_sites"
-     ? removeDuplicates(processedArray)
-     : processedArray;
+ const uniqueArray = removeDuplicates(processedArray);
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```javascript
const processedArray = Array.isArray(update[field])
  ? update[field]
  : [update[field]];

// Remove duplicates for specific fields
const uniqueArray = removeDuplicates(processedArray);

update["$set"] = update["$set"] || {};
update["$set"][field] = uniqueArray;
```
```javascript
// Ensure single ObjectId fields are processed as-is
update[field] = update[field];
```
Remove Redundant Self-Assignment
At line 116, `update[field] = update[field];` assigns `update[field]` to itself, which is unnecessary and can be omitted.
Apply this diff to remove the redundant code:
```diff
- const singleObjectIdFields = ["user_id", "group_id"];
- singleObjectIdFields.forEach((field) => {
-   if (update[field]) {
-     // Ensure single ObjectId fields are processed as-is
-     update[field] = update[field];
-   }
- });
```
If specific processing is intended for `singleObjectIdFields`, please implement the necessary logic.
Committable suggestion skipped: line range outside the PR's diff.
🧰 Tools
🪛 Biome (1.9.4)
[error] 116-116: field is assigned to itself.
This is where `field` is assigned.
(lint/correctness/noSelfAssign)
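If validation rather than a no-op was the intent, one possible direction (a hedged sketch that reuses the `processObjectId` helper referenced elsewhere in this file, with `update` as in the surrounding code) could be:

```javascript
const singleObjectIdFields = ["user_id", "group_id"];
singleObjectIdFields.forEach((field) => {
  if (update[field]) {
    // Hypothetical: normalize the value via processObjectId instead of
    // the no-op self-assignment flagged by Biome; drop it if unparsable.
    const processed = processObjectId(update[field]);
    if (processed) {
      update[field] = processed;
    } else {
      delete update[field];
    }
  }
});
```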
Codecov Report
All modified and coverable lines are covered by tests ✅
Additional details and impacted files:

```
@@           Coverage Diff            @@
##           staging    #4072   +/-   ##
========================================
  Coverage    11.81%    11.81%
========================================
  Files          115       115
  Lines        15482     15482
  Branches       319       319
========================================
  Hits          1829      1829
  Misses       13653     13653
```
Description
just ensuring uniqueness in the prefs
Changes Made
Testing
Affected Services
Endpoints Ready for Testing
API Documentation Updated?
Additional Notes
just ensuring uniqueness in the prefs
Summary by CodeRabbit
New Features
Bug Fixes
Refactor