
DB logs should have a max-age setting to prevent bloating table size #1494

Closed
timkelty opened this issue Aug 14, 2024 · 6 comments
@timkelty
Contributor

Description

File-based logs have rotation, but with the move to db logs, we should have a way to automatically limit the size of that table.

The idea is to add a maxLogAge setting:

  • use it as the default for feed-me/logs/clear
  • if set, clear old logs during Craft's garbage collection

#1487 (comment)
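The maxLogAge idea boils down to deleting rows older than a cutoff. A minimal sketch of that technique, using Python's sqlite3 and a hypothetical feedme_logs schema (the real table's columns may differ; maxLogAge, the column names, and the 30-day default here are assumptions for illustration):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical schema for illustration only; not the plugin's actual code.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE feedme_logs (id INTEGER PRIMARY KEY, message TEXT, dateCreated TEXT)"
)

now = datetime.now(timezone.utc)
conn.executemany(
    "INSERT INTO feedme_logs (message, dateCreated) VALUES (?, ?)",
    [
        ("old entry", (now - timedelta(days=90)).isoformat()),
        ("recent entry", (now - timedelta(days=1)).isoformat()),
    ],
)

# maxLogAge-style pruning: delete everything older than the cutoff.
# ISO-8601 UTC timestamps compare correctly as strings.
max_log_age_days = 30
cutoff = (now - timedelta(days=max_log_age_days)).isoformat()
deleted = conn.execute(
    "DELETE FROM feedme_logs WHERE dateCreated < ?", (cutoff,)
).rowcount
conn.commit()

remaining = conn.execute("SELECT message FROM feedme_logs").fetchall()
print(deleted, remaining)  # 1 [('recent entry',)]
```

Running this same DELETE from a garbage-collection hook is what the bullet points above describe.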

@frank-laemmer

The feedme_logs table is often the cause of bloated databases on the Craft CMS sites we support (we provide web hosting), and it is usually the biggest table. Customers are often unaware. So plus one for this feature.

@ul8

ul8 commented Jan 14, 2025

👍 I just manually freed up 10GB in a database by clearing the logs. Would really appreciate an automated option.

@mnlmaier

mnlmaier commented Jan 14, 2025

+1-ing this, came up in one of our projects as well (though not in the realms of 10GB+, that's wild).

Also, I was wondering: why should the logs be saved to the database at all, when there is an established structure of log files stored in storage/logs (as is also the case in other frameworks, like Laravel)? I'd prefer the traditional approach, but maybe that's just me. If there is a good reason for storing logs in the database, I'm genuinely curious to learn more about it.

(Edit: I'm now aware that this is configurable, might be interesting to some: https://github.com/craftcms/feed-me/blob/5.9.0/README.md#customizing-logs)

@timkelty
Contributor Author

timkelty commented Feb 7, 2025

I'm realizing now that the logs UI (which is the sole reason for the DB logs) doesn't currently have pagination and has a hard-coded limit of 300.
The simplest approach for now is probably to just prune the table to the most recent 300 entries on feed processing and garbage collection.

Does that seem reasonable to those having trouble with this? @frank-laemmer @ul8 @mnlmaier
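The prune-to-most-recent-300 approach described above can be sketched as a single DELETE with a subquery. A minimal illustration using Python's sqlite3 and a hypothetical feedme_logs schema (not the plugin's actual PHP code; ordering by autoincrement id stands in for ordering by creation date):

```python
import sqlite3

LOG_LIMIT = 300  # matches the CP UI's hard-coded display limit

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE feedme_logs (id INTEGER PRIMARY KEY, message TEXT)")
conn.executemany(
    "INSERT INTO feedme_logs (message) VALUES (?)",
    [(f"log {i}",) for i in range(350)],
)

# Keep only the LOG_LIMIT most recent rows (highest ids); delete the rest.
conn.execute(
    "DELETE FROM feedme_logs WHERE id NOT IN "
    "(SELECT id FROM feedme_logs ORDER BY id DESC LIMIT ?)",
    (LOG_LIMIT,),
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM feedme_logs").fetchone()[0]
oldest = conn.execute("SELECT MIN(id) FROM feedme_logs").fetchone()[0]
print(count, oldest)  # 300 51
```

Running this at the start of each feed import and during garbage collection caps the table's size regardless of how many feeds run.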

@angrybrad
Member

@mnlmaier

Also I was thinking, why should the logs be saved into the database at all, when there is an established structure with log files being stored in storage/logs (as it also is the case in other frameworks, like Laravel)? I'd prefer using the traditional approach, but maybe that's just me.

Same - it's old, legacy behavior that we'll be happy to get rid of for the next major version.

@timkelty
Contributor Author

timkelty commented Feb 7, 2025

#1594 has been released with Feed Me 5.10.0 and 6.7.0.
DB logs will now be pruned to the 300 entries the CP can display, at the start of every feed processing run and during garbage collection.

Note: logs are still always sent to standard logging (with a category of feed-me), unless you choose to filter them out. The feedme_logs table exists purely to support the existing CP logs UI.

@timkelty timkelty closed this as completed Feb 7, 2025