
Reduce redundant site API requests during Atomic wait #108624

Closed

paulopmt1 wants to merge 1 commit into trunk from
fix/reduce-redundant-site-requests-during-atomic-wait

Conversation

@paulopmt1
Contributor

Summary

During the entrepreneur setup flow's loading screen (waitForAtomic step), the app aggressively polls GET /rest/v1.2/sites/{id} every 1 second while waiting for the site to become Atomic. This generates 25+ identical requests over ~60 seconds, each returning ~62KB of data — all just to check a single boolean (is_wpcom_atomic).

Additionally, when the processing step mounts, two independent code paths in useSite() fire simultaneous requests for the same site data:

  • SITE_STORE resolver → GET /rest/v1.1/sites/{id}?force=wpcom
  • Redux requestSite → GET /rest/v1.2/sites/{id}

This PR addresses both issues:

1. Exponential backoff for waitForLatestSiteData

File: client/landing/stepper/hooks/use-wait-for-atomic.ts

Changed from a fixed 1s polling interval to exponential backoff starting at 3s, doubling each iteration, capped at 15s (3s → 6s → 12s → 15s → 15s...). This reduces requests from ~25 to ~6 with negligible UX impact since the user is on a loading screen anyway. This matches the pattern already used by waitForPluginInstall.
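The backoff schedule described above can be sketched as follows. This is a minimal illustration, not the actual Calypso code; `backoffDelayMs` and `pollWithBackoff` are hypothetical names standing in for the logic inside `waitForLatestSiteData`.

```typescript
const INITIAL_DELAY_MS = 3_000;
const MAX_DELAY_MS = 15_000;

// Delay before the (attempt+1)-th poll: starts at 3s, doubles each
// iteration, and is capped at 15s (3s → 6s → 12s → 15s → 15s...).
export function backoffDelayMs( attempt: number ): number {
	return Math.min( INITIAL_DELAY_MS * 2 ** attempt, MAX_DELAY_MS );
}

// Repeatedly invoke an async predicate (e.g. "is the site Atomic yet?")
// with exponential backoff, until it succeeds or attempts run out.
export async function pollWithBackoff(
	check: () => Promise< boolean >,
	maxAttempts = 10
): Promise< boolean > {
	for ( let attempt = 0; attempt < maxAttempts; attempt++ ) {
		if ( await check() ) {
			return true;
		}
		await new Promise( ( resolve ) =>
			setTimeout( resolve, backoffDelayMs( attempt ) )
		);
	}
	return false;
}
```

With this schedule, roughly six polls cover the ~60 seconds the fixed 1s interval previously filled with ~25 requests.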

2. Deduplicate dual-fetch in useSite()

File: client/landing/stepper/hooks/use-site.ts

Previously, useSite() triggered two independent fetch paths on mount:

  1. useSelect → SITE_STORE getSite() resolver → v1.1 API request
  2. useEffect → Redux requestSite() → v1.2 API request

Now, the hook waits for the SITE_STORE resolver to complete first. If it succeeds, the site data is synced to the Redux store via receiveSite() — no second network request. If the resolver fails, it falls back to requestSite() as before.
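The resolver-first-with-fallback shape can be illustrated with a small standalone function. This is a hedged sketch of the control flow only: `fetchPrimary`, `fetchSecondary`, and `syncToStore` are hypothetical stand-ins for the SITE_STORE resolver, the Redux `requestSite()` dispatch, and `receiveSite()` respectively.

```typescript
// Prefer the primary fetch path; sync its result so consumers of the
// second store see the same data without a second network request.
// Only if the primary path fails do we fall back to the secondary one.
export async function fetchSiteWithFallback< T >(
	fetchPrimary: () => Promise< T >,
	fetchSecondary: () => Promise< T >,
	syncToStore: ( site: T ) => void
): Promise< T > {
	try {
		const site = await fetchPrimary();
		// Mirrors receiveSite(): populate the other store from the
		// result already in hand instead of re-fetching.
		syncToStore( site );
		return site;
	} catch {
		return fetchSecondary();
	}
}
```

In the happy path only one request is made; the second request exists purely as an error-recovery branch, preserving the previous behavior when the resolver fails.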

Test plan

  • Go through the entrepreneur setup flow end-to-end
  • Verify the waitForAtomic loading screen completes successfully
  • Monitor network requests — should see ~6 site data requests instead of 25+
  • Verify no duplicate simultaneous site requests on processing step mount
  • Verify site data is available in Redux store after SITE_STORE resolves

🤖 Generated with Claude Code

During the entrepreneur setup flow's loading screen, the app polls
GET /rest/v1.2/sites/{id} every 1 second while waiting for the site
to become Atomic. This generates 25+ identical requests over ~60s,
each returning ~62KB.

Two changes:

1. Use exponential backoff (3s → 6s → 12s, capped at 15s) in
   waitForLatestSiteData instead of a fixed 1s interval. This
   reduces requests from ~25 to ~6 with no UX impact since the
   user is on a loading screen.

2. Eliminate a redundant dual-fetch in useSite(). Previously, both
   the SITE_STORE resolver (v1.1) and a Redux requestSite dispatch
   (v1.2) fired simultaneously on mount, making two requests for
   the same data. Now, useSite waits for the SITE_STORE resolver
   to complete and syncs its result to the Redux store via
   receiveSite, falling back to requestSite only if the resolver
   fails.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@matticbot
Contributor

This PR modifies the release build for the following Calypso Apps:

  • agents-manager
  • help-center
  • notifications
  • wpcom-block-editor

For info about this notification, see here: PCYsg-OT6-p2
To test WordPress.com changes, run install-plugin.sh $pluginSlug fix/reduce-redundant-site-requests-during-atomic-wait on your sandbox.

@paulopmt1 paulopmt1 force-pushed the fix/reduce-redundant-site-requests-during-atomic-wait branch from bf4c614 to 1f8b38d Compare February 18, 2026 18:02
@paulopmt1 paulopmt1 closed this Mar 26, 2026
@paulopmt1 paulopmt1 deleted the fix/reduce-redundant-site-requests-during-atomic-wait branch March 26, 2026 13:11