v0.7.0 #385
ValerianRey started this conversation in General
⚡ Performance update ⚡
In this release, we updated `torchjd` to remove some of the unnecessary overhead in the internal code. This should lead to small but noticeable performance improvements (up to 10% speed). We have also made `torchjd` more lightweight by making optional some dependencies that were only used by `CAGrad` and `NashMTL` (the changelog explains how to keep installing these dependencies). We have also fixed all internal type errors thanks to `mypy`, and we have added a `py.typed` file so `mypy` can be used downstream.

Changelog
Changed
- Changed the dependencies required by `CAGrad` and `NashMTL` to be optional when installing TorchJD. Users of these aggregators will have to use `pip install torchjd[cagrad]`, `pip install torchjd[nash_mtl]` or `pip install torchjd[full]` to install TorchJD alongside those dependencies. This should make TorchJD more lightweight.
- Made the `autojac` package protected. The aggregators must now always be imported via their package (e.g. `from torchjd.aggregation.upgrad import UPGrad` must be changed to `from torchjd.aggregation import UPGrad`). The `backward` and `mtl_backward` functions must now always be imported directly from the `torchjd` package (e.g. `from torchjd.autojac.mtl_backward import mtl_backward` must be changed to `from torchjd import mtl_backward`); a minimal sketch of the new import style follows this list.
- Removed the check that input matrices do not contain `nan`, `inf` or `-inf` values. This check was costly in memory and in time for large matrices, so this should improve performance. However, if the optimization diverges for some reason (for instance due to a too large learning rate), the resulting exceptions may come from other sources.
- Removed some unnecessary overhead in the `autojac` engine. This should lead to a small performance improvement.
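To make the import migration concrete, here is a minimal sketch of the new import style (the names are taken from the examples above):

```python
# Aggregators: import from torchjd.aggregation, not from their submodules.
from torchjd.aggregation import UPGrad

# backward and mtl_backward: import directly from the torchjd package.
from torchjd import backward, mtl_backward
```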
Fixed
- Made the non-differentiable aggregators (`CAGrad`, `ConFIG`, `DualProj`, `GradDrop`, `IMTLG`, `NashMTL`, `PCGrad` and `UPGrad`) raise a `NonDifferentiableError` whenever one tries to differentiate through them. Before this change, trying to differentiate through them led to wrong gradients or unclear errors. A sketch of the new behavior follows.
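As an illustration, here is a minimal sketch of the new behavior, assuming the usual torchjd convention of calling an aggregator instance on a 2-D Jacobian matrix; the toy matrix is hypothetical:

```python
import torch
from torchjd.aggregation import UPGrad

# Hypothetical toy Jacobian that requires grad, so that the aggregation
# itself becomes part of the computation graph.
jacobian = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
aggregated = UPGrad()(jacobian)

# As of v0.7.0, trying to differentiate through a non-differentiable
# aggregator raises a NonDifferentiableError instead of producing wrong
# gradients or an unclear error.
aggregated.sum().backward()
```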
Added
- Added a `py.typed` file in the top package of `torchjd` to ensure compliance with PEP 561. This should make it possible for users to use `mypy` against the type annotations provided in `torchjd`; a short usage sketch follows.
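For example, a downstream snippet like the following (hypothetical file name `example.py`) can now be checked with `mypy example.py`, since the `py.typed` marker tells type checkers to trust the annotations shipped with `torchjd`:

```python
# example.py - hypothetical downstream code.
# With py.typed shipped, mypy checks this call against torchjd's own
# annotations instead of treating the package as untyped.
import torch
from torchjd.aggregation import UPGrad

aggregator = UPGrad()
weights: torch.Tensor = aggregator(torch.ones(2, 3))
```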
This discussion was created from the release v0.7.0.