Caching for server side toggles #45
Comments
What we have right now

Server-side

App Router: You can do it with the App Router server-side fetch cache, for example:

```ts
const definitions = await getDefinitions({
  fetchOptions: {
    next: { revalidate: 15 },
  },
});
```

Docs: https://nextjs.org/docs/app/building-your-application/data-fetching#caching-data

Pages Router: You can have an API endpoint that fetches definitions and sets cache-control (Docs), then use that URL as the source of definitions, as in the middleware example.

Client-side

Unleash Edge: If you need to scale right now, we have Unleash Edge. It's a crazy-performant Rust implementation of a concept similar to the Unleash Proxy. We have clients experimenting with setting it up on Cloudflare Edge Workers. It doesn't work on Vercel yet.
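The Pages Router approach mentioned above (an API endpoint serving definitions with a cache-control header) could be sketched roughly like this. This is a hypothetical illustration: `getDefinitions` is stubbed so the example is self-contained, and a real project would import it from the SDK and type the handler with Next's `NextApiRequest`/`NextApiResponse`.

```typescript
// Minimal sketch of a Pages Router API route that serves toggle definitions
// with a Cache-Control header. getDefinitions is stubbed for illustration.
type Definitions = { version: number; features: { name: string }[] };

// Stub standing in for the real definitions fetch.
const getDefinitions = async (): Promise<Definitions> => ({
  version: 1,
  features: [{ name: "exampleToggle" }],
});

// Shaped like a Next.js API handler: res exposes setHeader/status/json.
export default async function handler(
  _req: unknown,
  res: {
    setHeader(name: string, value: string): void;
    status(code: number): { json(body: unknown): void };
  }
): Promise<void> {
  const definitions = await getDefinitions();
  // Let a CDN cache the response for 15s, then serve stale while revalidating.
  res.setHeader("Cache-Control", "s-maxage=15, stale-while-revalidate=30");
  res.status(200).json(definitions);
}
```

A client (or the middleware example) would then point at this route instead of the Unleash API directly, so the CDN absorbs repeated definition fetches.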
Plans

Client-side: I'll add an example of how to implement API route "proxying" for the frontend API.
@Tymek thanks for the reply. I'm particularly interested in the

I was thinking more of an in-memory cache, which would be instant and add no overhead at all.
You can use caching on SSR (docs). The downside is that if you return definitions from there to the client, you will expose your configuration. You could filter down to the toggles you're interested in, I guess.

Would including this in the library help? I'm hesitant, because resolving feature toggles on the frontend (running server-client on the frontend) isn't something Unleash supported before the Next.js SDK.

Correct me if I'm wrong, but I don't think it's possible to store something in-memory in a Serverless/Edge environment. I looked into using a KV store, but definitions are too big for that.
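Filtering the definitions down to only the toggles the client should see, as suggested above, could look something like this. This is a hypothetical helper with illustrative type shapes, not the SDK's real types or API.

```typescript
// Hypothetical helper: filter server-side definitions down to an allow-list
// before sending them to the client, so the full configuration isn't exposed.
// The type shapes are illustrative, not the SDK's real types.
type Feature = { name: string; enabled: boolean };
type Definitions = { version: number; features: Feature[] };

export const filterDefinitions = (
  defs: Definitions,
  allowed: string[]
): Definitions => ({
  ...defs,
  // Keep only the toggles the client is meant to see.
  features: defs.features.filter((f) => allowed.includes(f.name)),
});
```

The original object is left untouched, so the server can keep evaluating against the full set while returning only the filtered copy.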
By in-memory I don't mean anything fancy, just storing things in a local variable. We're not using Serverless/Edge, but I'm pretty sure you can do that there as well. That said, this would be more for the Pages Router, where you have a Node server running for a long time.
I don't think that will run as expected on Cloudflare/Vercel etc. I'll experiment with it later this week. And again, if you have any user-land implementations, sharing them would help :) A previous Next.js SDK experiment used this approach (archived: https://github.com/Unleash/next-unleash). I'll look into bringing it back.
Well, we are using

Something similar to this could work for us:

```ts
let definitions: Promise<ClientFeaturesResponse> | undefined;

export const getDefinitions = async (): Promise<ClientFeaturesResponse> => {
  if (!definitions) {
    definitions = loadDefinitions();
    setInterval(async () => {
      const updated = await loadDefinitions();
      definitions = Promise.resolve(updated);
    }, 15000);
  }
  return definitions;
};
```
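A variant of the snippet above with error handling on the background refresh, so a failed fetch keeps serving the last good definitions instead of caching a rejected promise. This is a sketch: `loadDefinitions` is stubbed here so the example is self-contained, and `ClientFeaturesResponse` is replaced by an illustrative type.

```typescript
// Sketch: in-memory definitions cache with periodic refresh and error
// handling. loadDefinitions is a stub standing in for the real fetch.
type Definitions = { version: number };

let fetchCount = 0;
const loadDefinitions = async (): Promise<Definitions> => ({
  version: ++fetchCount,
});

let definitions: Promise<Definitions> | undefined;
let refreshTimer: ReturnType<typeof setInterval> | undefined;

export const getDefinitions = (refreshMs = 15_000): Promise<Definitions> => {
  if (!definitions) {
    // Cache the promise itself so concurrent first calls share one fetch.
    definitions = loadDefinitions();
    refreshTimer = setInterval(async () => {
      try {
        const updated = await loadDefinitions();
        definitions = Promise.resolve(updated);
      } catch {
        // Refresh failed: keep serving the last good definitions.
      }
    }, refreshMs);
    // Don't keep a long-running Node process alive just for this timer.
    (refreshTimer as { unref?: () => void }).unref?.();
  }
  return definitions;
};
```

As noted earlier in the thread, this pattern relies on a long-lived process, so it suits the Pages Router on a Node server rather than Serverless/Edge runtimes.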
Ok. Thank you for explaining 👍🏻
I had similar thoughts to @Meemaw when trying to switch to this package for the first time just recently. I guess the biggest question I have is: why doesn't

I suppose maybe that wouldn't have made sense if the original motivation of this package was to support Serverless/Edge, but it's certainly worth supporting as an additional use case, considering how simple it ought to be to just wrap up
Describe the feature request
It seems the current recommended approaches to getting toggles with SSR/GSSP both use an uncached fetch to the API. Since definitions rarely change, I don't think it makes much sense to fetch them on every request; instead, keep an Unleash client instance around and re-fetch definitions every n seconds in the background.
With this, we can then just do the toggles evaluation for each request.
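The per-request evaluation step described above could look like this minimal sketch. The names and type shapes are made up for illustration; they are not the package's real API.

```typescript
// Illustrative sketch of evaluating a toggle per request against definitions
// kept in memory and refreshed in the background (names are hypothetical).
type Feature = { name: string; enabled: boolean };

// Stands in for definitions refreshed every n seconds by a background task.
const cachedFeatures: Feature[] = [{ name: "newCheckout", enabled: true }];

// Per-request evaluation: a cheap in-memory lookup, no network call.
export const isEnabled = (name: string): boolean =>
  cachedFeatures.find((f) => f.name === name)?.enabled ?? false;
```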
Background
No response
Solution suggestions
It's fairly easy to implement something like this in user-land, but I think it would make sense to document this pattern or build it into the package itself for ease of use.