
Conversation

@huguesdk
Member

avoid MemoryError when converting product images (in product and website_sale).

fixes #4712.

avoid memory error when converting website_sale images.
avoid memory error when converting product images.
@pedrobaeza
Member

I think this impacts performance, doesn't it?

@huguesdk
Member Author

Maybe; I didn’t profile this. Per image, it will make one database query to get the image, then several more when creating the new attachments, but since reading, resizing and writing image attachments is much slower than running a few extra queries, I think the impact should be minimal.
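For context, the pattern being discussed looks roughly like this (a sketch only, not the exact patch: it assumes an Odoo 16+ environment where env.flush_all() and env.invalidate_all() exist, the search domain is illustrative, and _convert_image() is a hypothetical stand-in for the actual resize/convert logic):

```python
# sketch: process attachments one record at a time (assumes Odoo 16+);
# _convert_image() is a hypothetical placeholder for the real conversion
attachment_ids = env["ir.attachment"].search(
    [("res_model", "=", "product.template"), ("mimetype", "like", "image/%")]
).ids

for attachment_id in attachment_ids:
    # browse a single record so only one image is held in memory
    attachment = env["ir.attachment"].browse(attachment_id)
    _convert_image(attachment)
    # flushing triggers the computation of the smaller-size images;
    # invalidating then drops them from the cache so memory stays flat
    env.flush_all()
    env.invalidate_all()
```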

@pedrobaeza
Member

Not sure... Maybe this can be made optional through a system parameter or similar?
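If it were made optional, the check could read a system parameter, for example (a sketch; the parameter key and the two helpers are hypothetical, while ir.config_parameter.get_param() is standard Odoo API):

```python
# sketch: gate the one-by-one conversion behind a system parameter
# ("product.convert_images_one_by_one" is a hypothetical key)
icp = env["ir.config_parameter"].sudo()
one_by_one = icp.get_param("product.convert_images_one_by_one", "1")
if one_by_one in ("1", "True", "true"):
    convert_one_record_at_a_time()  # hypothetical helper
else:
    convert_whole_recordset()  # hypothetical helper
```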

@pedrobaeza
Member

Or do it in batches of, say, 500?

# a MemoryError can happen if too many attachments are loaded in
# memory. therefore, we only load one record at a time and flush the
# cache at each iteration.
for attachment_id in attachment_ids:
Contributor

@MiquelRForgeFlow Nov 19, 2025


Suggested change
for attachment_id in attachment_ids:
for i, attachment_id in enumerate(attachment_ids, 1):

Comment on lines 40 to +41
)
# flush and clear cache to avoid filling up memory and force the
Contributor


Suggested change
)
# flush and clear cache to avoid filling up memory and force the
)
if i % 500 != 0:
    continue
# flush and clear cache to avoid filling up memory and force the
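Taken together, the two suggestions turn the loop into batched flushing, roughly as follows (same sketch assumptions as above: Odoo 16+ helpers and the hypothetical _convert_image()):

```python
# sketch of the batched variant: still one record per iteration, but
# flush/invalidate only every 500 records
BATCH_SIZE = 500
for i, attachment_id in enumerate(attachment_ids, 1):
    attachment = env["ir.attachment"].browse(attachment_id)
    _convert_image(attachment)  # hypothetical helper, see above
    if i % BATCH_SIZE != 0:
        continue
    # flush and clear cache to avoid filling up memory
    env.flush_all()
    env.invalidate_all()
# flush whatever the last partial batch left pending
env.flush_all()
```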

@huguesdk
Member Author

IIRC, the main issue comes from the fact that the smaller-size images are computed when flushing the model. I see in invalidate_cache() (called by chunked()) that only the base model is flushed instead of the one we’re working with. Could that be why chunked() wasn’t working in this case? chunked() loads PREFETCH_MAX (which is 1000) records at a time. With images this can quickly add up, although in most cases that would still need less than a few GiB of memory (but Odoo’s default hard memory limit is 2560 MiB).

If we’re doing it in batches of 500 for performance, then it would make sense to also prefetch the records in batches of the same size. Unfortunately, I don’t currently have time to test this, but feel free to suggest something if you do. A possible shape is sketched below.
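One way to prefetch by the same batch size (a sketch, assuming Odoo 16+ and the hypothetical _convert_image(); split_every() from odoo.tools is the standard batching helper):

```python
from odoo.tools import split_every

BATCH_SIZE = 500

# sketch: browse a whole batch at once so its records can be prefetched
# together, then flush and invalidate after each batch so memory is
# released before the next one
for batch_ids in split_every(BATCH_SIZE, attachment_ids):
    for attachment in env["ir.attachment"].browse(batch_ids):
        _convert_image(attachment)  # hypothetical helper
    env.flush_all()
    env.invalidate_all()
```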

@MiquelRForgeFlow
Contributor

I think the batches can be of size 1000.
