Crash when trying to convert a vpc into a raster with raster_tin #31

Open · flemmens opened this issue Dec 16, 2023 · 3 comments

@flemmens

Hello,

I would like to report an issue:

I have a bunch of .laz files which I would like to convert into a raster.

To begin I create a vpc, which works fine:

pdal_wrench build_vpc --output=myterrain.vpc --input-file-list=my-laz-tiles.txt

Then I want to convert the vpc into a raster tif using the raster_tin method:

pdal_wrench to_raster_tin --resolution=0.2 --input=myterrain.vpc --output=myterrain.tif

If I have only a few .laz files in my-laz-tiles.txt, everything works well.

However, if I have more files (let's say 100+), the process starts and begins to create the temporary directory, but after a while it crashes with:

0...Killed

I tried with different files on several computers, and it always crashes after a while if there are too many files in the vpc. Sometimes it makes it to 100% but then gets stuck creating the final tif (probably because it doesn't use BigTIFF and is over the 4 GB limit of regular TIFFs).

@flemmens (Author)

Note: when using to_raster instead of to_raster_tin, it usually gets a bit further but eventually also crashes. I noticed that all CPUs are used at 100%, which is probably not good, as I can't even log in to my machine from another terminal.

@wonder-sk (Collaborator)

hi @flemmens

I think the problem in your case is that you run out of memory and the pdal_wrench process gets killed. By default, pdal_wrench runs its computation in multiple threads - as many as the number of logical CPUs. The raster export via TIN is quite memory intensive, so try limiting the number of threads with e.g. the --threads 4 command line option and see if that helps.
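
For example, applying that suggestion to the command from the original report would look something like this:

pdal_wrench to_raster_tin --threads 4 --resolution=0.2 --input=myterrain.vpc --output=myterrain.tif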

BigTIFF should not be a problem - the raster creation is done by GDAL, which will use BigTIFF automatically if needed.

As for to_raster also crashing, that can be a different problem. A backtrace of the crash would help...
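
As a rough sketch of that kind of check on Linux (the command line below is just the one from the report, adjust as needed), one could first rule out an out-of-memory kill and then capture a backtrace of a genuine crash:

# was the process killed by the kernel's OOM killer?
sudo dmesg | grep -i -E "out of memory|killed process"

# if it is a real crash (segfault etc.), run the tool under gdb and print a backtrace
gdb --args pdal_wrench to_raster --resolution=0.2 --input=myterrain.vpc --output=myterrain.tif
# inside gdb: type "run", wait for the crash, then "bt" to print the backtrace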

@flemmens (Author)

Thank you for your reply @wonder-sk, I was not aware of the --threads option.

I made a new test with a small VPC of 219 laz files (24 GB) and ran the process on a 96 GB RAM computer with 48 cores, using only 4 threads.

The process lasts a lot longer but finally crashes at 30%, so it still does not work. However, the subtiles created in the temporary directory are OK: I was able to join them using GDAL and got a GeoTIFF covering about 1/3 of my terrain.
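
For reference, a typical GDAL mosaic of such subtiles (not necessarily the exact commands used here; the directory and output names are placeholders) could look like this:

gdalbuildvrt subtiles.vrt /path/to/tmpdir/*.tif
gdal_translate -co BIGTIFF=YES -co COMPRESS=DEFLATE subtiles.vrt myterrain_partial.tif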

All the .laz files used for the test have been checked: I can process them with a classic PDAL pipeline without any issue. If you want to run the test yourself, I can provide an S3 link to the files.
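
As an illustration of that kind of per-file check, a minimal classic PDAL pipeline writing one tile to a raster might look like this (the tile name and writer options are hypothetical, not the actual pipeline used):

cat > check_tile.json <<'EOF'
[
    "tile_0001.laz",
    {
        "type": "writers.gdal",
        "filename": "tile_0001.tif",
        "resolution": 0.2,
        "output_type": "mean"
    }
]
EOF
pdal pipeline check_tile.json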
