Releases · KevinMusgrave/pytorch-metric-learning
v2.9.0
Features
- Added SmoothAPLoss (usage sketch below). Thanks @ir2718!
- Improved SubCenterArcFaceLoss and GenericPairLoss. Thanks @lucamarini22, @marcpaga!
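A minimal usage sketch for the new SmoothAPLoss, assuming default constructor arguments and equal samples per class in the batch; check the docs for the exact requirements and tunable parameters:

```python
import torch
from pytorch_metric_learning import losses

# Minimal sketch, assuming default constructor arguments; see the docs
# for tunable parameters such as the temperature.
loss_fn = losses.SmoothAPLoss()

# Smooth-AP optimizes a ranking objective, so the batch should contain
# multiple samples per class (equal class counts assumed here).
embeddings = torch.randn(32, 128, requires_grad=True)  # (batch, dim)
labels = torch.arange(4).repeat_interleave(8)          # 8 samples per class

loss = loss_fn(embeddings, labels)
loss.backward()
```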
v2.8.1
Fixed some module import issues.
v2.8.0
v2.7.0
v2.6.0
Improvement + small breaking change to DistributedLossWrapper
- Changed the `emb` argument of `DistributedLossWrapper.forward` to `embeddings`, to be consistent with the rest of the library.
- Added a warning and early return when `DistributedLossWrapper` is used in a non-distributed setting.
- Thank you @elisim!
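A minimal sketch of the renamed argument, assuming a `torch.distributed` process group has already been initialized; per this release, a single-process run now warns and returns early instead:

```python
import torch
from pytorch_metric_learning import losses

# Sketch of the v2.6.0 calling convention. Assumes torch.distributed has
# been initialized (e.g. inside a DistributedDataParallel training
# script); otherwise the wrapper warns and returns early.
loss_fn = losses.DistributedLossWrapper(losses.ContrastiveLoss())

embeddings = torch.randn(32, 128, requires_grad=True)
labels = torch.randint(0, 4, (32,))

# The keyword is now `embeddings` (previously `emb`):
loss = loss_fn(embeddings=embeddings, labels=labels)
```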
v2.5.0
Improvements
Thanks @mkmenta!
v2.4.1
This is identical to v2.4.0, but includes the LICENSE file which was missing from v2.4.0.
v2.4.0
Features
- Added DynamicSoftMarginLoss (sketched below). See PR #659. Thanks @domenicoMuscill0!
- Added RankedListLoss. See PR #659. Thanks @domenicoMuscill0!
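A minimal sketch of the two new losses under the library's standard `(embeddings, labels)` convention; the `RankedListLoss` constructor arguments shown here are illustrative assumptions, not documented defaults:

```python
import torch
from pytorch_metric_learning import losses

# Both losses take the standard (embeddings, labels) inputs.
# The RankedListLoss arguments (margin, Tn) are assumptions picked for
# illustration; check the docs for the real signature and good values.
dsm_loss = losses.DynamicSoftMarginLoss()
rl_loss = losses.RankedListLoss(margin=1.0, Tn=0.5)

embeddings = torch.randn(32, 128, requires_grad=True)
labels = torch.randint(0, 4, (32,))

loss = dsm_loss(embeddings, labels) + rl_loss(embeddings, labels)
loss.backward()
```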
Bug fixes
- Fixed issue where PNPLoss would return NaN when a batch sample had no corresponding positive. See PR #660. Thanks @Puzer and @interestingzhuo!
Tests
- Fixed the test for HistogramLoss to work with PyTorch 2.1. Thanks @GaetanLepage!
v2.3.0
Features
- Added HistogramLoss. See pull request #651. Thanks @domenicoMuscill0!
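A minimal sketch, assuming default constructor arguments (the histogram binning is tunable; see the docs):

```python
import torch
from pytorch_metric_learning import losses

# HistogramLoss uses the standard (embeddings, labels) interface.
loss_fn = losses.HistogramLoss()

embeddings = torch.randn(32, 128, requires_grad=True)
labels = torch.randint(0, 4, (32,))
loss = loss_fn(embeddings, labels)
```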
v2.2.0
Features
- Added ManifoldLoss. See pull request #635. Thanks @domenicoMuscill0!
- Added P2SGradLoss. See pull request #635. Thanks @domenicoMuscill0!
- Added the `symmetric` flag to `SelfSupervisedLoss`. If `True`, the embeddings in both `embeddings` and `ref_emb` are used as anchors. If `False`, only the embeddings in `embeddings` are used as anchors. The previous behavior was equivalent to `symmetric=False`. The default is now `symmetric=True`, because this is usually what is done in self-supervised papers (e.g. SimCLR).
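A minimal sketch of the new flag, using `TripletMarginLoss` as the wrapped loss for illustration:

```python
import torch
from pytorch_metric_learning import losses

# SelfSupervisedLoss wraps an ordinary loss and pairs row i of
# `embeddings` with row i of `ref_emb` (e.g. two augmented views of the
# same image), so no labels are needed.
loss_fn = losses.SelfSupervisedLoss(
    losses.TripletMarginLoss(),
    symmetric=True,  # the new default: both views act as anchors
)

view1 = torch.randn(32, 128, requires_grad=True)
view2 = torch.randn(32, 128, requires_grad=True)
loss = loss_fn(view1, view2)

# Pass symmetric=False to recover the previous behavior.
```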