
Just ensuring uniqueness in the prefs #4072

Merged
merged 1 commit into staging on Dec 13, 2024
Conversation

Baalmart (Contributor) commented Dec 13, 2024

Description

just ensuring uniqueness in the prefs

Changes Made

  • just ensuring uniqueness in the prefs

Testing

  • Tested locally
  • Tested against staging environment
  • Relevant tests passed: [List test names]

Affected Services

  • Which services were modified:
    • Auth Service

Endpoints Ready for Testing

  • New endpoints ready for testing:
    • just ensuring uniqueness in the prefs

API Documentation Updated?

  • Yes, API documentation was updated
  • No, API documentation does not need updating

Additional Notes

just ensuring uniqueness in the prefs

Summary by CodeRabbit

  • New Features

    • Introduced a utility function to remove duplicate entries in preference updates.
  • Bug Fixes

    • Enhanced error handling for invalid ObjectId entries during data processing.
  • Refactor

    • Streamlined the logic for handling unique entries in selected arrays, improving readability and maintainability.
    • Updated the approach for updating array fields to ensure uniqueness.

coderabbitai bot (Contributor) commented Dec 13, 2024

📝 Walkthrough

Walkthrough

The changes in this pull request involve significant modifications to the PreferenceSchema pre-save hook and the introduction of a new utility function in the auth-service. The updates focus on consolidating the handling of selected arrays and ObjectId fields to ensure uniqueness. The previous method of using separate arrays and $addToSet has been replaced with a streamlined approach that employs a filtering mechanism and the $set operator. Error handling remains consistent, ensuring robust validation during the data processing phase.

Changes

  • src/auth-service/models/Preference.js: Updated the PreferenceSchema pre-save hook to consolidate array handling, ensuring uniqueness via filtering and using $set instead of $addToSet. Error handling for invalid ObjectId values is retained.
  • src/auth-service/utils/create-preference.js: Introduced the removeDuplicates utility function to filter duplicates in prepareUpdate, and updated the logic to use $set for unique array values instead of $addToSet. Error handling is consistent across functions.
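The removeDuplicates utility itself is not reproduced in this summary; a minimal sketch of what such a helper might look like (the function body, the _id-based keying, and the sample data are assumptions for illustration, not the PR's exact code):

```javascript
// Hypothetical sketch of a removeDuplicates helper: keeps the first
// occurrence of each entry, keyed by _id (falling back to the serialized
// value for primitives).
function removeDuplicates(arr) {
  const seen = new Set();
  return arr.filter((item) => {
    const key =
      item && item._id !== undefined ? String(item._id) : JSON.stringify(item);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

// Example: the second entry with _id "a" is dropped.
const deduped = removeDuplicates([
  { _id: "a", name: "site 1" },
  { _id: "b", name: "site 2" },
  { _id: "a", name: "site 1 (dup)" },
]);
console.log(deduped.length); // 2
```

Paired with $set, the whole deduplicated array replaces the stored field, whereas $addToSet only guards against duplicates among newly pushed elements.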

Suggested reviewers

  • Codebmk
  • OchiengPaul442

🎉 In the realm of code, where logic flows,
Unique entries bloom, as the pre-save knows.
With filters in place, and errors at bay,
Preferences shine bright, in a streamlined way!
So here’s to the changes, both clever and neat,
In the world of auth, we’ve crafted a treat! 🌟



Contributor

Auth-service changes in this PR available for preview here

Copy link
Contributor

coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Outside diff range and nitpick comments (3)
src/auth-service/models/Preference.js (2)

258-265: Consider Simplifying Duplicate Removal Logic

The current method for removing duplicates based on _id works but can be simplified for better readability and efficiency. Using a Map to track unique entries may enhance performance.

Here's an alternative approach:

const uniqueArray = [
  ...new Map(
    updateData[field].map((item) => [item._id.toString(), item])
  ).values(),
];
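As a standalone check of the Map-based approach above (the sample data is illustrative), note one behavioral detail: with a Map keyed on _id, the last duplicate wins, whereas a first-occurrence filter would keep the earlier entry:

```javascript
// Later entries with the same _id overwrite earlier ones in the Map.
const sampleField = [
  { _id: "64a1", value: 10 },
  { _id: "64a2", value: 20 },
  { _id: "64a1", value: 30 }, // duplicate _id: this entry survives
];

const uniqueArray = [
  ...new Map(sampleField.map((item) => [item._id.toString(), item])).values(),
];

console.log(uniqueArray.length); // 2
console.log(uniqueArray.find((i) => i._id === "64a1").value); // 30
```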

290-301: Simplify Unique ID Processing for Array ID Fields

The logic for ensuring unique ObjectIds in arrayIdFields can be streamlined. Consider using a Set directly on the processed ObjectIds.

Here's a simplified version:

const uniqueIds = Array.from(
  new Set(
    (Array.isArray(updateData[field]) ? updateData[field] : [updateData[field]])
      .map(processObjectId)
      .filter(Boolean)
  )
);
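One caveat worth noting about the suggestion above: a Set compares objects by reference, so it only deduplicates if processObjectId returns a primitive (e.g. a hex string) rather than ObjectId instances. A standalone illustration with a stubbed processObjectId (the stub is an assumption; the PR's actual helper may differ):

```javascript
// Stub: assume processObjectId normalizes input to a string, or null for
// missing/invalid input so that filter(Boolean) can drop it.
const processObjectId = (id) => (id ? String(id) : null);

const rawValue = ["64a1f0", "64a1f0", null, "64b2c3"]; // may also be a single value
const uniqueIds = Array.from(
  new Set(
    (Array.isArray(rawValue) ? rawValue : [rawValue])
      .map(processObjectId)
      .filter(Boolean)
  )
);

console.log(uniqueIds); // ["64a1f0", "64b2c3"]
```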
src/auth-service/utils/create-preference.js (1)

73-74: Update Comment to Reflect Code Changes

The comment mentions handling fields that should be "added to set," but the code now uses $set instead of $addToSet. To avoid confusion, update the comment to accurately reflect the current implementation.

Here's the revised comment:

-      // Handle fields that should be added to set (array fields)
+      // Handle fields by setting array values
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8a4d071 and f563aa7.

📒 Files selected for processing (2)
  • src/auth-service/models/Preference.js (2 hunks)
  • src/auth-service/utils/create-preference.js (1 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
src/auth-service/utils/create-preference.js

[error] 116-116: field is assigned to itself.

This is where the self-assignment occurs.

(lint/correctness/noSelfAssign)

Comment on lines +100 to +106
const uniqueArray =
field === "selected_sites"
? removeDuplicates(processedArray)
: processedArray;

update["$set"] = update["$set"] || {};
update["$set"][field] = uniqueArray;
Copy link
Contributor


🛠️ Refactor suggestion

Ensure Consistent Duplicate Removal Across Fields

Similarly, in the processing of fieldsToUpdate, duplicates are removed only for "selected_sites". To prevent potential duplicate entries and maintain consistency, consider applying the removeDuplicates function to all relevant fields.

Modify the code as follows:

-          const uniqueArray =
-            field === "selected_sites"
-              ? removeDuplicates(processedArray)
-              : processedArray;
+          const uniqueArray = removeDuplicates(processedArray);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

-          const uniqueArray =
-            field === "selected_sites"
-              ? removeDuplicates(processedArray)
-              : processedArray;
+          const uniqueArray = removeDuplicates(processedArray);
           update["$set"] = update["$set"] || {};
           update["$set"][field] = uniqueArray;

Comment on lines +74 to +85
const processedArray = Array.isArray(update[field])
? update[field]
: [update[field]];

// Remove duplicates for specific fields
const uniqueArray =
field === "selected_sites"
? removeDuplicates(processedArray)
: processedArray;

update["$set"] = update["$set"] || {};
update["$set"][field] = uniqueArray;
Copy link
Contributor


🛠️ Refactor suggestion

Inconsistent Use of Duplicate Removal

In the prepareUpdate function, the removeDuplicates function is applied only to "selected_sites" in fieldsToAddToSet. If other fields may contain duplicates, consider applying the duplicate removal to all relevant fields for consistency.

Update the code as follows:

-          const uniqueArray =
-            field === "selected_sites"
-              ? removeDuplicates(processedArray)
-              : processedArray;
+          const uniqueArray = removeDuplicates(processedArray);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

           const processedArray = Array.isArray(update[field])
             ? update[field]
             : [update[field]];

           // Remove duplicates for specific fields
-          const uniqueArray =
-            field === "selected_sites"
-              ? removeDuplicates(processedArray)
-              : processedArray;
+          const uniqueArray = removeDuplicates(processedArray);

           update["$set"] = update["$set"] || {};
           update["$set"][field] = uniqueArray;

Comment on lines +115 to +116
// Ensure single ObjectId fields are processed as-is
update[field] = update[field];
Copy link
Contributor


⚠️ Potential issue

Remove Redundant Self-Assignment

At line 116, update[field] = update[field]; is assigning update[field] to itself, which is unnecessary and can be omitted.

Apply this diff to remove the redundant code:

-      const singleObjectIdFields = ["user_id", "group_id"];
-      singleObjectIdFields.forEach((field) => {
-        if (update[field]) {
-          // Ensure single ObjectId fields are processed as-is
-          update[field] = update[field];
-        }
-      });

If specific processing is intended for singleObjectIdFields, please implement the necessary logic.

Committable suggestion skipped: line range outside the PR's diff.

🧰 Tools
🪛 Biome (1.9.4)

[error] 116-116: field is assigned to itself.

This is where the self-assignment occurs.

(lint/correctness/noSelfAssign)


codecov bot commented Dec 13, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 11.81%. Comparing base (666c098) to head (f563aa7).
Report is 69 commits behind head on staging.

Additional details and impacted files

Impacted file tree graph

@@           Coverage Diff            @@
##           staging    #4072   +/-   ##
========================================
  Coverage    11.81%   11.81%           
========================================
  Files          115      115           
  Lines        15482    15482           
  Branches       319      319           
========================================
  Hits          1829     1829           
  Misses       13653    13653           

Baalmart merged commit 9267ecd into staging on Dec 13, 2024
52 checks passed
Baalmart deleted the hf-prefs-update-2 branch on December 13, 2024 at 14:39
Baalmart mentioned this pull request on Dec 13, 2024