AA User I can run intensive mutations without getting an error #6025

Open
Weiko opened this issue Jun 25, 2024 · 2 comments

Comments


Weiko commented Jun 25, 2024

Scope & Context

Currently we have a limitation on the API, MUTATION_MAXIMUM_RECORD_AFFECTED, defined as an env variable, which caps the number of records that can be affected by a GraphQL mutation (specifically update and delete mutations; create mutations are not covered, for some reason).
This limitation comes from pg_graphql, see "atMost" in the documentation here https://supabase.github.io/pg_graphql/api. If it is not specified, it falls back to the default value (60).
This is also good practice to avoid overloading the server, and in this case the DB. It also tells the user: "We handle at most this many records, so you can expect an acceptable timeframe for your request; anything beyond that might result in degraded performance or timeouts."
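For reference, pg_graphql exposes this cap as an `atMost` argument on its update/delete mutations. A minimal sketch of how a client could embed the cap into such a mutation; the `DeleteBatch` name, the `Person` collection, and the filter shape are illustrative and do not reflect Twenty's actual schema:

```typescript
// Build a pg_graphql-style delete mutation that passes the cap via "atMost".
// Collection and filter names here are hypothetical examples.
function buildDeleteMutation(collection: string, atMost: number): string {
  return `
    mutation DeleteBatch($ids: [UUID!]) {
      deleteFrom${collection}Collection(
        filter: { id: { in: $ids } },
        atMost: ${atMost}
      ) {
        affectedCount
      }
    }`;
}

// e.g. buildDeleteMutation('Person', 60) embeds "atMost: 60" in the query
```

If more rows would match the filter than `atMost` allows, pg_graphql rejects the mutation, which is the error the frontend currently surfaces unhandled.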

Current behavior

For example, when we try to delete more than {MUTATION_MAXIMUM_RECORD_AFFECTED} records, we get an error. That is fine from the API point of view, but the frontend should handle it properly. It is even more of an issue UX-wise when people check the box at the top of the table, which implies selecting all records, so the mutation should affect all of them (even those that are not loaded yet).

Expected behavior

We still want to keep a limitation on the API, but the client should handle intensive mutations with a few strategies:

  • The FE should run mutations in batches: call each mutation sequentially, respecting the
    MUTATION_MAXIMUM_RECORD_AFFECTED limit. (e.g. MUTATION_MAXIMUM_RECORD_AFFECTED=160 means we paginate our mutations with 160 records per page until all records have been affected.) The UI should reflect this, similarly to the CSV export, with a pagination/percentage view.
  • Above a certain threshold, we need another solution and should tell the user to defer their action because it is too intensive => here we want to run an async job in the background. This should be clearly represented in the UI.
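The batching strategy above can be sketched as follows; the `mutateBatch` and `onProgress` callbacks are hypothetical placeholders for the real mutation call and the UI progress update, not Twenty's actual API:

```typescript
// Paginate a mutation over the selected records, MUTATION_MAXIMUM_RECORD_AFFECTED
// at a time, reporting percentage progress after each batch (as in the CSV export).
async function runBatchedMutation<T>(
  records: T[],
  limit: number, // MUTATION_MAXIMUM_RECORD_AFFECTED, read from clientConfig
  mutateBatch: (batch: T[]) => Promise<void>,
  onProgress: (percent: number) => void,
): Promise<void> {
  for (let i = 0; i < records.length; i += limit) {
    await mutateBatch(records.slice(i, i + limit));
    const done = Math.min(i + limit, records.length);
    onProgress(Math.round((done / records.length) * 100));
  }
}
```

With MUTATION_MAXIMUM_RECORD_AFFECTED=160 and 400 selected records, this issues three sequential mutations (160, 160, 80) and reports 40%, 80%, 100%.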

Technical inputs

  • To let the frontend know MUTATION_MAXIMUM_RECORD_AFFECTED, we need to expose this value in the getClientConfig endpoint
  • ...
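Once exposed, the frontend could read the cap from the client config and pick a strategy before mutating. A sketch under stated assumptions: the `ClientConfig` shape and the 10-batch threshold are illustrative choices, not taken from the issue:

```typescript
// Hypothetical shape of the value exposed through getClientConfig.
interface ClientConfig {
  mutationMaximumRecordAffected: number;
}

// Decide how to run the mutation from the selection size and the exposed cap.
// The "10 batches" cutoff for deferring to an async job is an assumption.
function chooseStrategy(
  selectedCount: number,
  config: ClientConfig,
): 'single' | 'batched' | 'async-job' {
  const limit = config.mutationMaximumRecordAffected;
  if (selectedCount <= limit) return 'single';
  if (selectedCount <= limit * 10) return 'batched';
  return 'async-job';
}
```

For example, with a cap of 60: 50 records run as a single mutation, 100 records are batched, and 1000 records would be deferred to a background job.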

greptile-apps bot commented Jun 25, 2024

Implementation Steps

  1. Expose MUTATION_MAXIMUM_RECORD_AFFECTED in getClientConfig endpoint:

    • File: api/src/config/config.service.ts
    • Add MUTATION_MAXIMUM_RECORD_AFFECTED to the configuration object returned by getClientConfig.
    getClientConfig() {
      return {
        ...
        // Parse the env var to a number; `process.env.X || 60` would return
        // a string whenever the variable is set.
        MUTATION_MAXIMUM_RECORD_AFFECTED: parseInt(process.env.MUTATION_MAXIMUM_RECORD_AFFECTED ?? '60', 10),
      };
    }
  2. Modify frontend to handle batching of mutations:

    • File: frontend/src/hooks/useMutations.ts
    • Update the mutation logic to batch requests based on MUTATION_MAXIMUM_RECORD_AFFECTED.
    const batchMutations = async (records) => {
      const limit = config.MUTATION_MAXIMUM_RECORD_AFFECTED;
      for (let i = 0; i < records.length; i += limit) {
        const batch = records.slice(i, i + limit);
        await mutateBatch(batch); // Assume mutateBatch is a function that handles the mutation
      }
    };
  3. Handle UI updates for batching and async jobs:

    • File: frontend/src/components/MutationProgress.tsx
    • Implement a progress bar or pagination view to reflect the mutation progress.
    const MutationProgress = ({ total, completed }) => (
      <ProgressBar now={(completed / total) * 100} />
    );
    • Integrate MutationProgress component in the relevant UI parts where mutations are triggered.
  4. Handle async job initiation for intensive mutations:

    • File: frontend/src/hooks/useMutations.ts
    • Add logic to initiate an async job if the number of records exceeds a certain threshold.
    // INTENSIVE_MUTATION_THRESHOLD is not defined yet; it would be a constant
    // (or config value) above which the mutation is deferred to a background job.
    const handleIntensiveMutation = async (records) => {
      if (records.length > INTENSIVE_MUTATION_THRESHOLD) {
        await initiateAsyncJob(records); // Assume initiateAsyncJob is a function that handles async job initiation
      } else {
        await batchMutations(records);
      }
    };

References


Weiko commented Jun 25, 2024

See #6023 as a "temporary" solution.

Weiko added a commit that referenced this issue Jun 26, 2024
We are exposing the server mutationMaximumRecordAffected value via
clientConfig so it can be used by the FE.

This is a first step for #6025

<img width="610" alt="Screenshot 2024-06-26 at 14 58 26"
src="https://github.com/twentyhq/twenty/assets/1834158/9d192976-fd22-45d2-bdaa-a8ff6bb90ca2">