Once our extractions are migrated to the STAC API, we need to think about how to launch the actual patch-to-point jobs, most likely per ref_id and then split further per EPSG zone.
Query the STAC API with a property filter on a specific EPSG code.
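A minimal sketch of such a query with pystac-client, assuming the API supports the CQL2 filter extension; the catalog URL, collection name and EPSG code below are placeholders:

```python
from pystac_client import Client

# Placeholder catalog URL and collection name for the patch extractions.
catalog = Client.open("https://stac.example.org/api")

search = catalog.search(
    collections=["worldcereal-patch-extractions"],
    # CQL2 filter on the projection EPSG code of each item.
    filter={"op": "=", "args": [{"property": "proj:epsg"}, 32631]},
    filter_lang="cql2-json",
)
items = list(search.items())
print(f"Found {len(items)} patches in EPSG:32631")
```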
From this we can construct a new job_tracker in which each job corresponds to a different EPSG code.
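For example, the job_tracker could be a small dataframe with one row per EPSG zone, which a job manager (e.g. openeo's MultiBackendJobManager) can then work through; the ref_id and EPSG codes below are made up:

```python
import pandas as pd

# Assumed: the distinct proj:epsg values found among the items of this ref_id.
epsg_zones = [32631, 32632, 32633]

job_tracker = pd.DataFrame(
    {
        "ref_id": "2021_BE_LPIS",  # hypothetical ref_id
        "epsg": epsg_zones,
        "status": "not_started",   # one batch job is launched per row
    }
)
job_tracker.to_csv("job_tracker_2021_BE_LPIS.csv", index=False)
```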
Get the total bounds of all STAC items returned by the query. With these bounds, query the RDM API for all features that intersect the bounds and have extract_flag == 1. Here we still need a sampling procedure (just take centroids?).
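A rough sketch with geopandas of the bounds computation, the RDM query and a centroid-based sampling, reusing `items` from the pystac-client sketch above; the RDM endpoint and its query parameters are purely hypothetical:

```python
import geopandas as gpd
from shapely.geometry import box

# Union of the bounding boxes of all STAC items returned for this EPSG zone.
patch_boxes = gpd.GeoSeries([box(*item.bbox) for item in items], crs="EPSG:4326")
minx, miny, maxx, maxy = patch_boxes.total_bounds

# Hypothetical RDM request: endpoint and parameters are assumptions.
rdm_features = gpd.read_file(
    "https://rdm.example.org/collections/2021_BE_LPIS/items"
    f"?bbox={minx},{miny},{maxx},{maxy}&extract_flag=1"
)

# Simplest possible sampling procedure: one point per feature, at its centroid.
sample_points = rdm_features.copy()
sample_points["geometry"] = rdm_features.geometry.centroid
```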
load_stac the S1 and S2 patch extractions from the API, load the pre-composited AgERA5 and DEM data, apply the standard preprocessing and merge the cubes.
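Something along these lines with the openEO Python client; the STAC URLs, collection names, bands and temporal extent are assumptions, and the actual preprocessing is only hinted at:

```python
import openeo

connection = openeo.connect("openeo.dataspace.copernicus.eu").authenticate_oidc()

# Assumed STAC collection URLs, band names and temporal extent.
temporal_extent = ["2021-01-01", "2022-01-01"]
s2 = connection.load_stac(
    "https://stac.example.org/collections/s2-patch-extractions",
    temporal_extent=temporal_extent,
    bands=["B02", "B03", "B04", "B08"],
)
s1 = connection.load_stac(
    "https://stac.example.org/collections/s1-patch-extractions",
    temporal_extent=temporal_extent,
    bands=["VV", "VH"],
)

# Pre-composited AgERA5 meteo and DEM; collection and band names are assumptions.
meteo = connection.load_collection("AGERA5", temporal_extent=temporal_extent, bands=["temperature-mean"])
dem = connection.load_collection("COPERNICUS_30")

# Standard preprocessing (masking, compositing, interpolation, ...) would go here,
# after which everything is merged into a single cube.
cube = s2.merge_cubes(s1).merge_cubes(meteo).merge_cubes(dem)
```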
aggregate_spatial on the sampled points and execute the batch job. It remains to be seen whether this is possible in a single batch job.
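Continuing from `cube` and `sample_points` in the sketches above, the aggregation and batch job could look roughly like this; the output format and job title are assumptions:

```python
import json

# Convert the sampled points to a GeoJSON FeatureCollection in lat/lon.
point_geojson = json.loads(sample_points.to_crs("EPSG:4326").to_json())

# Reduce the patch values at every sampled point to a time series.
timeseries = cube.aggregate_spatial(geometries=point_geojson, reducer="mean")

# One batch job per EPSG zone; whether this all fits in a single job is still to be verified.
job = timeseries.create_job(out_format="Parquet", title="patch-to-point 2021_BE_LPIS EPSG:32631")
job.start_and_wait()
```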
Download the results as GeoParquet and merge them into the existing GeoParquet database of point extractions.
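A possible way to do the merge with geopandas, reusing `job` from the previous sketch and assuming the batch job wrote a single GeoParquet asset; the file names are placeholders:

```python
import geopandas as gpd
import pandas as pd

# Fetch the batch job results (the asset filename is an assumption).
job.get_results().download_files("results/2021_BE_LPIS_32631")
new_points = gpd.read_parquet("results/2021_BE_LPIS_32631/timeseries.parquet")

# Append to the existing GeoParquet database of point extractions.
existing = gpd.read_parquet("point_extractions.parquet")
merged = pd.concat([existing, new_points], ignore_index=True)
merged.to_parquet("point_extractions.parquet")
```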
If the job is successful: mark all extracted features as extracted in the RDM (using extract_flag?), to avoid double extractions.
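How the RDM API exposes this is still open; purely as an illustration (endpoint, payload and flag value are all hypothetical, `job` and `rdm_features` come from the sketches above):

```python
import requests

# Hypothetical RDM call to flip the extract_flag of the processed features.
if job.status() == "finished":
    feature_ids = rdm_features["id"].tolist()
    response = requests.patch(
        "https://rdm.example.org/collections/2021_BE_LPIS/extract_status",
        json={"ids": feature_ids, "extract_flag": 2},  # assumed: 2 == "extracted"
        timeout=60,
    )
    response.raise_for_status()
```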