Add workflow to run prediction on lidar-prod optimization dataset
leavauchier committed Apr 25, 2024
1 parent 5f876d6 commit 328a7fc
Showing 2 changed files with 107 additions and 0 deletions.
104 changes: 104 additions & 0 deletions .github/workflows/predict-for-lidar-prod-optimization.yml
@@ -0,0 +1,104 @@
# Workflow name
name: "Prediction on lidar-prod optimization dataset"

on:
  # Run workflow on user request
  workflow_dispatch:
    inputs:
      user:
        description: |
          Username:
          Used to build a standard path for the outputs in the IA folder
          of the store (projet-LHD/IA/MYRIA3D-SHARED-WORKSPACE/$USER/$SAMPLING_NAME/)
        required: true
      sampling_name:
        description: |
          Sampling name:
          Name of the dataset the model was trained on.
          Used to build a standard path for the outputs in the IA folder
          of the store (projet-LHD/IA/MYRIA3D-SHARED-WORKSPACE/$USER/$SAMPLING_NAME/)
          E.g. YYYYMMDD_MonBeauDataset
        required: true
      model_id:
        description: |
          Model identifier:
          Name of the checkpoint file to use for predictions (without the .ckpt extension!)
          ($MODEL_ID.ckpt must exist in projet-LHD/IA/MYRIA3D-SHARED-WORKSPACE/$USER/$SAMPLING_NAME/)
          Also used to build the output folder
          (projet-LHD/IA/LIDAR-PROD-OPTIMIZATION/$SAMPLING_NAME/$MODEL_ID)
          Example: YYYYMMDD_MonBeauSampling_epochXXX_Myria3Dx.y.z
        required: true
      predict_config_name:
        description: |
          Name of the myria3d config file (.yaml file) to use for prediction.
          Must exist in projet-LHD/IA/MYRIA3D-SHARED-WORKSPACE/$USER/$SAMPLING_NAME/
          Example: YYYYMMDD_MonBeauSampling_epochXXX_Myria3Dx.y.z_predict_config_Vx.y.z.yaml
        required: true

jobs:
  predict-validation-dataset:
    runs-on: self-hosted
    env:
      OUTPUT_DIR: /var/data/LIDAR-PROD-OPTIMIZATION/${{ github.event.inputs.sampling_name }}/${{ github.event.inputs.model_id }}/
      DATA: /var/data/LIDAR-PROD-OPTIMIZATION/20221018_lidar-prod-optimization-on-151-proto/Comparison/
      CONFIG_DIR: /var/data/MYRIA3D-SHARED-WORKSPACE/${{ github.event.inputs.user }}/${{ github.event.inputs.sampling_name }}/
      BATCH_SIZE: 2
    steps:
      - name: Log configuration
        run: |
          echo "Run prediction on lidar-prod optimization datasets (val and test)"
          echo "Sampling name: ${{ github.event.inputs.sampling_name }}"
          echo "User name: ${{ github.event.inputs.user }}"
          echo "Checkpoint name: ${{ github.event.inputs.model_id }}"
          echo "Prediction config name: ${{ github.event.inputs.predict_config_name }}"
          echo "Output_dir: ${{env.OUTPUT_DIR}}"
          echo "Data: ${{env.DATA}}"
          echo "Config files dir: ${{env.CONFIG_DIR}}"
      - name: Checkout branch
        uses: actions/checkout@v4

      # See https://github.com/marketplace/actions/setup-micromamba
      - name: setup-micromamba
        uses: mamba-org/setup-micromamba@v1
        with:
          environment-file: environment.yml
          environment-name: myria3d  # activate the environment
          cache-environment-key: environment-myria3d-predict-validation-dataset  # create cache for this pipeline only
          # Do not restore downloads as they are already stored by micromamba
          # cache-downloads-key: downloads-myria3d-predict-validation-dataset
          generate-run-shell: true
          download-micromamba: false
          micromamba-binary-path: /var/data/.local/bin/micromamba
          micromamba-root-path: /var/data/micromamba

      - name: Run prediction on validation dataset
        shell: micromamba-shell {0}
        # The key=value arguments below are Hydra overrides of the prediction config
        run: >
          python run.py
          --config-path ${{env.CONFIG_DIR}}
          --config-name ${{ github.event.inputs.predict_config_name }}
          task.task_name=predict
          predict.src_las=${{env.DATA}}/val/*.laz
          predict.ckpt_path=${{env.CONFIG_DIR}}${{ github.event.inputs.model_id }}.ckpt
          predict.output_dir=${{env.OUTPUT_DIR}}/preds-valset/
          predict.interpolator.probas_to_save=[building]
          predict.gpus=0
          datamodule.batch_size=${{env.BATCH_SIZE}}
          datamodule.tile_width=1000
      - name: Run prediction on test dataset
        shell: micromamba-shell {0}
        run: >
          python run.py
          --config-path ${{env.CONFIG_DIR}}
          --config-name ${{ github.event.inputs.predict_config_name }}
          task.task_name=predict
          predict.src_las=${{env.DATA}}/test/*.laz
          predict.ckpt_path=${{env.CONFIG_DIR}}${{ github.event.inputs.model_id }}.ckpt
          predict.output_dir=${{env.OUTPUT_DIR}}/preds-testset/
          predict.interpolator.probas_to_save=[building]
          predict.gpus=0
          datamodule.batch_size=${{env.BATCH_SIZE}}
          datamodule.tile_width=1000
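
For reference, a dispatch of this workflow from the GitHub CLI might look like the sketch below; the input values are hypothetical placeholders that follow the naming conventions described in the workflow inputs above.

    # Hypothetical dispatch via the GitHub CLI; all -f values are placeholders
    gh workflow run predict-for-lidar-prod-optimization.yml \
      -f user=jdoe \
      -f sampling_name=20240425_MonBeauDataset \
      -f model_id=20240425_MonBeauSampling_epoch100_Myria3D3.8.4 \
      -f predict_config_name=20240425_MonBeauSampling_epoch100_Myria3D3.8.4_predict_config_V3.8.4.yaml

Predictions are then written under ${{env.OUTPUT_DIR}}/preds-valset/ and ${{env.OUTPUT_DIR}}/preds-testset/ on the self-hosted runner.
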
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,8 @@
# CHANGELOG

- Add a GitHub Actions workflow to run a trained model on the lidar-prod thresholds optimization dataset
  (in order to automate thresholds optimization)

### 3.8.4
- fix: move IoU appropriately to fix wrong device error created by a breaking change in torch when using DDP.
