Out of memory when optimizing large rasters even with --no-in-memory option #219
Comments
It looks like you're running out of memory when building overviews. That step is handled by rasterio and is not something we have control over. Did you try to use GDAL directly?
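For reference, building the COG directly with GDAL (3.1 or newer, which ships the COG driver) while capping GDAL's block cache might look like the sketch below. The file names and creation options are placeholders, not the exact command used in this thread:

```bash
# Build a Cloud Optimized GeoTIFF directly with GDAL's COG driver.
# input.tif / output_cog.tif are placeholder names.
# GDAL_CACHEMAX caps the raster block cache (in MB) so that overview
# generation does not balloon memory usage.
gdal_translate input.tif output_cog.tif \
  -of COG \
  -co COMPRESS=DEFLATE \
  -co BIGTIFF=YES \
  -co NUM_THREADS=ALL_CPUS \
  --config GDAL_CACHEMAX 1024
```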
Thanks for your reply. I tried it with GDAL; the resulting COG TIFF is 37 GB. When it is served with terracotta, rendering on the map fails intermittently or freezes.
Can you be more specific? Which command did you use? What does "fails" mean? Do you see any errors or warnings in the logs?
The file size shouldn't be a problem per se; we've served files in that ballpark without issues. Right, @j08lue? I'm not sure how smart the new GDAL is. Can you run it through a COG validator to make sure that it's a valid COG? And can you post the output of gdalinfo?
Yes, I checked with the rio-cogeo library and GDAL's validation script. Both confirm that it is a valid COG.
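For anyone landing here, the two checks mentioned above can be run as follows, assuming rio-cogeo is installed and GDAL's validation script is available locally; the file name is a placeholder:

```bash
# Validate with rio-cogeo
rio cogeo validate output_cog.tif

# Validate with GDAL's bundled sample script
python validate_cloud_optimized_geotiff.py output_cog.tif
```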
That looks mostly OK to me; I only see one smaller issue. You do ingest the raster before serving it, right? And how do you serve it? Edit: There is one other thing to check as well.
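A minimal ingest-and-serve sequence with the terracotta CLI looks roughly like the sketch below; the key pattern and database path are placeholders, and the actual deployment in this issue may differ:

```bash
# Ingest raster metadata into a terracotta SQLite database.
# The {name} placeholder in the pattern becomes a key in the database.
terracotta ingest "rasters/{name}.tif" -o terracotta.sqlite

# Serve the ingested rasters with the built-in development server.
terracotta serve -d terracotta.sqlite
```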
In daily use, our images are mostly on the order of a tenth of that size on disk (1-5 GB). But size on disk only tells part of the story because of compression: if your raster is sparsely filled or has large areas of the same value, it will compress very well. The dimensions in your gdalinfo output give a rough idea of the uncompressed size; see the sketch below.
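The uncompressed size is simply width × height × bands × bytes per sample. A back-of-the-envelope check with made-up dimensions (not the actual raster from this issue):

```bash
# Hypothetical raster: 100000 x 100000 pixels, 1 band, Float32 (4 bytes/sample).
# 100000 * 100000 * 1 * 4 bytes is roughly 37 GiB uncompressed.
echo $((100000 * 100000 * 1 * 4 / 1024 / 1024 / 1024))   # prints 37
```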
Command used:
Here is the error output after 1.5 hours of running the command. The size of the output file before the error occurs is ~14 GB:
Here is the gdalinfo output of the tif file: