The current implementation of the SBI library contains significant code duplication within the train(...) methods of SNPE, SNRE, and SNLE. These methods share many common functionalities, including:
Building the neural network
Resuming training
Managing the training and validation loops
This redundancy increases the complexity of the codebase, making it harder to maintain and more prone to inconsistencies and bugs, particularly during updates or enhancements.
To address this, we propose refactoring these methods by introducing a unified train function in the base class. This common train function would handle the shared aspects of the training process, while accepting specific losses and other relevant keyword arguments as parameters to handle the differences between SNPE, SNRE, and SNLE.
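As a purely illustrative sketch of the proposed structure (class and method names here are hypothetical, not the actual sbi API), the shared loop could live in the base class, with each method supplying only its own loss and optional hooks:

```python
from abc import ABC, abstractmethod


class TrainerBase(ABC):
    """Hypothetical sketch of the proposed shared train loop (not the real sbi API)."""

    def __init__(self):
        self.epoch_losses = []

    @abstractmethod
    def _loss(self, batch):
        """Method-specific loss: the only part SNPE/SNLE/SNRE-like children define."""

    def _on_epoch_end(self, epoch, loss):
        """Hook for behaviour shared by some, but not all, methods; children may override."""

    def train(self, batches, max_epochs=2):
        """Shared loop: epochs, batching, and bookkeeping live here exactly once."""
        for epoch in range(max_epochs):
            total = sum(self._loss(batch) for batch in batches)
            self.epoch_losses.append(total / len(batches))
            self._on_epoch_end(epoch, self.epoch_losses[-1])
        return self.epoch_losses


class SquaredErrorTrainer(TrainerBase):
    """Toy child: contributes only its specific loss, inherits everything else."""

    def _loss(self, batch):
        pred, target = batch
        return (pred - target) ** 2
```

The real loop would of course also handle network building, resuming, validation, and early stopping, but the division of labour would be the same: shared mechanics in the base class, method-specific pieces behind overridable hooks.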
Proposed Steps
Identify and abstract the common code segments across the train methods of SNPE, SNRE, and SNLE.
Design a generic train function in the base class that accepts specific losses and other arguments unique to each method. Parts shared by some, but not all, methods should be factored into separate class methods that child classes can override if required.
Refactor the existing train methods to use the new generic function, passing their specific requirements as arguments.
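After the refactoring step above, each subclass's train could reduce to a thin wrapper that forwards its specific loss and keyword arguments to the base implementation. A minimal sketch (all names hypothetical; the loss here is a toy stand-in, not the real posterior loss):

```python
class InferenceBase:
    """Illustrative sketch of a generic base train (not the real sbi API)."""

    def train(self, loss_fn, batches, **loss_kwargs):
        # Generic loop: the method-specific part enters only via loss_fn
        # and its keyword arguments.
        return sum(loss_fn(batch, **loss_kwargs) for batch in batches) / len(batches)


class SNPELike(InferenceBase):
    """Toy stand-in for SNPE: its train becomes a thin wrapper."""

    def train(self, batches, calibration_kernel=None):
        # Forward the SNPE-specific loss and its kwargs to the shared loop.
        return super().train(
            self._posterior_loss, batches, calibration_kernel=calibration_kernel
        )

    @staticmethod
    def _posterior_loss(batch, calibration_kernel=None):
        # Stand-in loss; the optional kernel reweights individual batches.
        weight = 1.0 if calibration_kernel is None else calibration_kernel(batch)
        return weight * batch
```

The same wrapper pattern would apply to the SNRE- and SNLE-style classes, each passing its own loss function and kwargs.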
We encourage contributors to discuss strategies for this refactoring and help with the implementation. This effort will improve the library’s maintainability and ensure consistency across its components.
If you identify other areas where significant code duplication can be reduced, please create a new issue (e.g., #921).
This will become even more relevant when we have a common dataloader interface and agnostic loss functions for all SBI methods. But I am removing the hackathon label for now as it will not be done before the release.
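One way such an agnostic loss interface could look (purely illustrative, not an agreed design): every method exposes a loss with one common signature, so a single shared driver can train any of them.

```python
from typing import Callable, Sequence

# A method-agnostic loss signature: map a batch of (theta, x) values to a
# scalar, so the shared driver never needs to know which method it trains.
LossFn = Callable[[Sequence[float], Sequence[float]], float]


def fit(loss_fn: LossFn, thetas: Sequence[float], xs: Sequence[float], epochs: int = 1):
    """Shared driver: any loss obeying the common signature plugs in unchanged."""
    history = []
    for _ in range(epochs):
        history.append(loss_fn(thetas, xs))
    return history


def toy_mse_loss(thetas, xs):
    # Stand-in loss obeying the agnostic signature.
    return sum((t - x) ** 2 for t, x in zip(thetas, xs)) / len(thetas)
```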
Example redundancies
sbi/sbi/inference/snpe/snpe_base.py, lines 340 to 379 (commit 9e224da)
sbi/sbi/inference/snle/snle_base.py, lines 214 to 244 (commit 9e224da)
sbi/sbi/inference/snre/snre_base.py, lines 228 to 260 (commit 9e224da)