automatically archive old blueprints and inventory collections #7278

Open
davepacheco opened this issue Dec 18, 2024 · 3 comments
davepacheco (Collaborator) commented Dec 18, 2024

tl;dr: I propose that we:

  • Update our manual software update procedure to use omdb db reconfigurator-save before mupdate for every release. Maybe we could store these into a debug dataset on the Scrimlet, sort of like a log file? (Maybe we could just put them into a directory that already gets archived for log files?)
  • When doing automated update, do the equivalent thing: serialize all the Reconfigurator state as JSON and store it into a debug dataset that gets archived.
  • Optionally include these in support bundles, similar to whatever we're going to do with log files and core files.
  • Consider, at the start of each update (both manual mupdates and automated updates): delete all blueprints that are older than the current target or whose parent blueprint no longer exists.
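The deletion criterion in the last bullet can be sketched as a predicate over blueprint metadata. This is a hedged illustration with made-up types and field names (`BlueprintMeta`, `prune_candidates`, an integer stand-in for the creation timestamp), not the real Omicron schema or API:

```rust
use std::collections::BTreeMap;

// Hypothetical, simplified blueprint metadata -- not the real Omicron types.
#[derive(Clone, Debug)]
struct BlueprintMeta {
    id: u64,
    parent: Option<u64>,
    time_created: u64, // stand-in timestamp; ordering means "older than"
}

/// Returns the ids of blueprints eligible for deletion: anything older than
/// the current target, plus anything whose parent no longer exists. The
/// target itself is always kept.
fn prune_candidates(all: &BTreeMap<u64, BlueprintMeta>, target_id: u64) -> Vec<u64> {
    let target_time = all[&target_id].time_created;
    all.values()
        .filter(|b| b.id != target_id)
        .filter(|b| {
            b.time_created < target_time
                || b.parent.map_or(false, |p| !all.contains_key(&p))
        })
        .map(|b| b.id)
        .collect()
}
```

A real implementation would presumably express this as a CockroachDB query rather than an in-memory scan, but the predicate is the same.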

Why: once a blueprint is replaced as the current target, it's no longer useful to the system. But it never gets deleted unless an operator explicitly deletes it. This wouldn't be a big deal (the same is true for many other database records) except that keeping old blueprints around in CockroachDB makes it much harder to evolve the system, because the schema has to remain able to represent those old blueprints. For example, suppose we add a new blueprint field in release N that is always filled in by release N+1. If we were still keeping blueprints from release N-1 around, we'd have to be able to represent blueprints that don't have the field set, even though that's illegal in today's software. The same applies to inventory collections, except that those get deleted automatically after a few minutes.

At the same time, historical blueprints and collections can be useful for understanding how a system has changed over time. In debugging tricky path-dependent problems in production systems we might well want to go look at very old blueprints and collections.

Also:

  • We always need to be able to read the current target blueprint. When we come up after an update, the target blueprint was written by a previous version of the software. So we're still somewhat constrained here, but we only have to support blueprints going back one release.
  • It would be nice to always be able to read the last inventory collected by the previous release, though it might also be reasonable to say that we'll just always collect a new one after the update anyway.
  • We also need to figure out what to do with blueprints that were never made the target. We could remove them when their parent is removed, since that means they can never be made the target again?
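The "can never be made the target again" reasoning in the last bullet can be made concrete. This sketch assumes, as a simplification (not a statement about the real Nexus API), that a blueprint is only eligible to become the target while its parent is the current target; under that rule, once a blueprint's parent is gone, the blueprint is permanently ineligible and safe to delete:

```rust
// Hypothetical simplified model -- not the real Omicron types.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct BlueprintId(u64);

struct Blueprint {
    id: BlueprintId,
    parent: Option<BlueprintId>,
}

/// Assumed eligibility rule: a blueprint can only be made the target if
/// its parent is the current target. A blueprint whose parent has been
/// deleted therefore fails this check forever.
fn eligible_to_become_target(b: &Blueprint, current_target: BlueprintId) -> bool {
    b.parent == Some(current_target)
}
```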

With the proposal above:

  • we should have a historical record of all blueprints and many collections
  • the only blueprint we necessarily keep across an upgrade is the target from before the upgrade
  • we do need to keep supporting older blueprints for one release
  • we might want to do that for inventory collections, too
  • we never need to support reading blueprints or inventory collections from more than one release ago
  • by using reconfigurator-save, we'll wind up with a bunch of other related useful state (e.g., inventory collections that went into some of those blueprints)
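The archiving step in the proposal (serialize the Reconfigurator state and drop it into a debug dataset, log-file-style) might look roughly like this. The directory layout and file-naming convention here are assumptions for illustration, not the real debug-dataset layout, and the serialized state is treated as an opaque JSON string:

```rust
use std::fs;
use std::path::{Path, PathBuf};

/// Write serialized Reconfigurator state into a debug directory using a
/// timestamped filename, the way archived log files are kept.
/// `when_unix_secs` is the capture time; the naming scheme is made up.
fn archive_state(
    debug_dir: &Path,
    serialized_json: &str,
    when_unix_secs: u64,
) -> std::io::Result<PathBuf> {
    fs::create_dir_all(debug_dir)?;
    let path = debug_dir.join(format!("reconfigurator-state-{when_unix_secs}.json"));
    fs::write(&path, serialized_json)?;
    Ok(path)
}
```

An archiver that already sweeps this directory for log files would then pick these up for free, which is the appeal of reusing the existing log-archival path.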
jgallagher (Contributor) commented:

  • Consider: at the start of each update, both during manual mupdates and during automated update: delete all blueprints older than the current target or whose parent blueprint no longer exists.

I think this could happen after updates too, right? If I have a system on release N that has 10 blueprints, 1 of which is the current target, and I'm going to update to release N+1, I have to still be able to read the existing current target, so presumably I can still read the other 9 too. Once I've made a new blueprint from release N+1, then I can delete all 10 of the old ones in one go.

I'm not sure there's a meaningful technical difference between these, but "don't delete old stuff until we're running new stuff" and "delete all the old stuff at once instead of deleting most of it before the upgrade and the last of it after" both seem appealing.

jgallagher (Contributor) commented:

Update our manual software update procedure to use omdb db reconfigurator-save before mupdate for every release. Maybe we could store these into a debug dataset on the Scrimlet, sort of like a log file? (Maybe we could just put them into a directory that already gets archived for log files?)

Hah, sorry for kinda making the same comment twice, but: I think this would be more valuable after the update than before, right? If we do it before, we know what the system looked like before the upgrade, but for any ongoing work for release N+2, it's much more useful to know what the system looked like after the upgrade. Maybe the first time we do this we collect both? Or if it's not too onerous, collect both every time?

@davepacheco davepacheco changed the title automatically archive old blueprints automatically archive old blueprints and inventory collections Dec 19, 2024
davepacheco (Collaborator, Author) commented:

Yeah, maybe doing it both before and after is best. Doing it before feels like it gives us a bit of a safety net if anything during or immediately after the upgrade goes wrong. But I can see the appeal of having that information from after, too.
