feat: add best sparse model flag #3709
Add `--use-best-sparse-model` flag in `ns-process-data`
Context 📝
When using the exhaustive method to estimate poses, COLMAP sometimes generates multiple sparse models in `colmap/sparse/`. In these cases, model 0 is not always the best and can have far fewer registered images than the other models. Currently, the pipeline always refines intrinsics and saves the `transforms.json` based on model 0.

Problem ⚠️

This behavior can lead to suboptimal results when model 0 has fewer registered images than other sparse reconstructions. While it is technically possible for users to select the best model manually by running `ns-process-data` twice with additional parameters, this process is cumbersome and error-prone, especially for new users.

Solution 💡
This PR introduces a flag (`--use-best-sparse-model`) that allows users to choose whether to use the best sparse model (i.e., the one with the most registered images) for both refining intrinsics and saving the `transforms.json`. The flag is enabled by default.

To support this, the PR adds a helper class called `BestSparseModel`, which (see the sketch after this list):

- finds the sparse model in `colmap/sparse/` with the most registered images;
- takes the `colmap_dir` path to locate the sparse models;
- exposes `get_model()` and `get_model_path()`, reducing code repetition.
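For reference, a minimal sketch of what such a helper might look like. This is not the exact PR code: using `read_images_binary` from nerfstudio's `colmap_parsing_utils` to count registered images is an assumption, and the real implementation may differ in details.

```python
from pathlib import Path

from nerfstudio.data.utils.colmap_parsing_utils import read_images_binary


class BestSparseModel:
    """Pick the sparse model under colmap/sparse/ with the most registered images."""

    def __init__(self, colmap_dir: Path) -> None:
        # colmap_dir is the directory containing the "sparse/" folder,
        # e.g. <output_dir>/colmap.
        self.colmap_dir = colmap_dir

    def get_model(self) -> str:
        """Return the name of the best model (e.g. "0", "1", ...)."""
        sparse_dir = self.colmap_dir / "sparse"
        best_name, best_count = "0", -1
        for model_dir in sorted(p for p in sparse_dir.iterdir() if p.is_dir()):
            images_bin = model_dir / "images.bin"
            if not images_bin.exists():
                continue
            # Number of registered images in this reconstruction.
            num_registered = len(read_images_binary(images_bin))
            if num_registered > best_count:
                best_name, best_count = model_dir.name, num_registered
        return best_name

    def get_model_path(self) -> Path:
        """Return the full path to the best sparse model."""
        return self.colmap_dir / "sparse" / self.get_model()
```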
Usage in the pipeline 🔧

In `process_data/colmap_utils.py`, when `refine_intrinsics` is enabled, the best sparse model is the one selected for intrinsics refinement.
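A hedged sketch of how this could be wired up. The function name `refine_intrinsics_on_best_model` and the exact bundle-adjuster invocation below are illustrative, not necessarily the PR's code:

```python
from pathlib import Path

from nerfstudio.utils.scripts import run_command


def refine_intrinsics_on_best_model(colmap_dir: Path, use_best_sparse_model: bool = True) -> None:
    """Run COLMAP's bundle_adjuster on the selected sparse model."""
    if use_best_sparse_model:
        # BestSparseModel is the helper sketched above.
        model_path = BestSparseModel(colmap_dir).get_model_path()
    else:
        # Previous behavior: always refine model 0.
        model_path = colmap_dir / "sparse" / "0"

    bundle_adjuster_cmd = [
        "colmap bundle_adjuster",
        f"--input_path {model_path}",
        f"--output_path {model_path}",
        "--BundleAdjustment.refine_principal_point 1",
    ]
    run_command(" ".join(bundle_adjuster_cmd), verbose=False)
```

This ensures that the best sparse model is selected for intrinsics refinement.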
In `process_data/colmap_converter_to_nerfstudio_dataset.py`, the class `ColmapConverterToNerfstudioDataset` has its `absolute_colmap_model_path` property modified.
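A simplified sketch of the modified property. The stub below only shows the fields relevant here; exposing the flag as a `use_best_sparse_model: bool = True` dataclass field is how it would typically surface, but the exact wiring is an assumption:

```python
from dataclasses import dataclass
from pathlib import Path


@dataclass
class ColmapConverterToNerfstudioDataset:  # illustrative stub of the real class
    output_dir: Path
    colmap_model_path: Path = Path("colmap/sparse/0")
    use_best_sparse_model: bool = True  # the new flag, enabled by default

    @property
    def absolute_colmap_model_path(self) -> Path:
        if self.use_best_sparse_model:
            # Resolve to the reconstruction with the most registered images
            # (BestSparseModel is the helper sketched earlier).
            return BestSparseModel(self.output_dir / "colmap").get_model_path()
        # Previous behavior: always colmap/sparse/0.
        return self.output_dir / self.colmap_model_path
```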
This ensures that whenever `colmap_model_path` is requested, the path of the best model is returned when the flag is enabled. That path is then consumed, for example, in the function `_save_transforms`, so the `transforms.json` is generated based on the best sparse model.
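For illustration, a simplified view of how `_save_transforms` would pick up the property, assuming the existing `colmap_to_json` helper in `colmap_utils` is what writes `transforms.json`; the real method takes more arguments:

```python
from nerfstudio.process_data import colmap_utils


def _save_transforms(self) -> int:  # simplified signature
    # transforms.json is written from the reconstruction at recon_dir; with the
    # flag enabled, absolute_colmap_model_path already points at the best model.
    return colmap_utils.colmap_to_json(
        recon_dir=self.absolute_colmap_model_path,
        output_dir=self.output_dir,
    )
```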
Expected Impact 🌟

With the `--use-best-sparse-model` flag enabled by default, the pipeline will now:

- refine intrinsics and generate the `transforms.json` based on the best available reconstruction, instead of always using model 0;
- keep the previous behavior available by setting the flag to `False` ✅.