
chore(deps): bump torch from 2.0.0 to 2.0.1 #324

Open
dependabot[bot] wants to merge 1 commit into main

Conversation

dependabot[bot] commented on behalf of github on May 15, 2023

Bumps torch from 2.0.0 to 2.0.1.
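
After merging, the bump can be sanity-checked at runtime; a minimal sketch (the version check itself is the only assumption here):

```python
# Confirm the environment resolved to the pinned torch version after the bump.
import torch

assert torch.__version__.startswith("2.0.1"), torch.__version__
print(torch.__version__)
```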

Release notes

Sourced from torch's releases.

PyTorch 2.0.1 Release, bug fix release

This release is meant to fix the following issues (regressions / silent correctness):

  • Fix _canonical_mask throwing a warning when bool masks are passed as input to TransformerEncoder/TransformerDecoder (#96009, #96286); see the sketch after this list
  • Fix EmbeddingBag with max_norm=-1 causing "leaf Variable that requires grad is being used in an in-place operation" #95980
  • Fix type hint for torch.Tensor.grad_fn, which can be a torch.autograd.graph.Node or None. #96804
  • Can’t convert float to int when the input is a scalar np.ndarray. #97696
  • Revisit torch._six.string_classes removal #97863
  • Fix module backward pre-hooks to actually update gradient #97983
  • Fix load_sharded_optimizer_state_dict error on multi node #98063
  • Warn once for TypedStorage deprecation #98777
  • Fix incorrect use of emplace in the cuDNN V8 API benchmark cache #97838
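
A minimal sketch of the bool-mask case from #96009/#96286 (shapes and sizes are illustrative assumptions): on 2.0.0, passing a boolean src_key_padding_mask produced a spurious _canonical_mask warning; on 2.0.1 it runs warning-free.

```python
import torch
import torch.nn as nn

layer = nn.TransformerEncoderLayer(d_model=16, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

src = torch.randn(2, 5, 16)                         # (batch, seq, d_model)
padding_mask = torch.zeros(2, 5, dtype=torch.bool)  # True marks padded positions
padding_mask[:, -1] = True

out = encoder(src, src_key_padding_mask=padding_mask)  # no spurious warning on 2.0.1
print(out.shape)  # torch.Size([2, 5, 16])
```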

Torch.compile:

  • Add support for Modules with custom __getitem__ method to torch.compile #97932; see the sketch after this list
  • Fix improper guards on list variables #97862
  • Fix Sequential nn module with duplicated submodule #98880
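
A minimal sketch of the pattern covered by #97932 (the Branches module below is a hypothetical example): a Module with a custom __getitem__ can now be traced by torch.compile.

```python
import torch
import torch.nn as nn

class Branches(nn.Module):
    def __init__(self):
        super().__init__()
        self.blocks = nn.ModuleList(nn.Linear(8, 8) for _ in range(3))

    def __getitem__(self, idx):
        return self.blocks[idx]  # custom indexing into submodules

    def forward(self, x):
        return self[0](x) + self[1](x)  # exercises the custom __getitem__

model = torch.compile(Branches())
print(model(torch.randn(2, 8)).shape)  # torch.Size([2, 8])
```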

Distributed:

  • Fix distributed_c10d's handling of custom backends #95072
  • Fix MPI backend not properly initialized #98545

NN_frontend:

  • Update Multi-Head Attention's doc string #97046
  • Fix incorrect behavior of is_causal parameter for torch.nn.TransformerEncoderLayer.forward #97214; see the sketch after this list
  • Fix error for SDPA on sm86 and sm89 hardware #99105
  • Fix nn.MultiheadAttention mask handling #98375
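
A minimal sketch around #97214 (dimensions are illustrative assumptions): in 2.0.x, is_causal acts as a hint, so a matching causal src_mask should still be supplied; 2.0.1 corrects the parameter's behavior.

```python
import torch
import torch.nn as nn

layer = nn.TransformerEncoderLayer(d_model=16, nhead=4, batch_first=True)
src = torch.randn(2, 5, 16)

# Explicit causal mask; is_causal=True asserts that src_mask has this form.
causal_mask = nn.Transformer.generate_square_subsequent_mask(5)
out = layer(src, src_mask=causal_mask, is_causal=True)
print(out.shape)  # torch.Size([2, 5, 16])
```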

DataLoader:

  • Fix regression for pin_memory recursion when operating on bytes #97737; see the sketch after this list
  • Fix collation logic #97789
  • Fix potentially backwards-incompatible change with DataLoader and is_shardable DataPipes #97287
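
A minimal sketch of the regression fixed in #97737 (the dataset below is a hypothetical stand-in): on 2.0.0, pin_memory recursed into bytes elements; on 2.0.1 they pass through untouched.

```python
import torch
from torch.utils.data import DataLoader, Dataset

class BytesDataset(Dataset):
    """Yields a tensor alongside a raw-bytes payload."""
    def __len__(self):
        return 4

    def __getitem__(self, i):
        return torch.randn(3), b"raw-metadata"

loader = DataLoader(BytesDataset(), batch_size=2, pin_memory=True)
for tensors, blobs in loader:
    print(tensors.shape, blobs)  # bytes are collated as a list, not recursed into
```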

MPS:

  • Fix LayerNorm crash when input is in float16 #96208
  • Add support for cumsum on int64 input #96733; see the sketch after this list
  • Fix issue with setting BatchNorm to non-trainable #98794
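
A minimal sketch of #96733, assuming an Apple-silicon machine where the MPS backend is available:

```python
import torch

if torch.backends.mps.is_available():
    x = torch.arange(6, dtype=torch.int64, device="mps")
    print(torch.cumsum(x, dim=0))  # int64 cumsum works on "mps" as of 2.0.1
else:
    print("MPS backend not available; skipping")
```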

Functorch:

  • Fix segmentation fault for vmapped functions accessing BatchedTensor.data #97237
  • Fix index_select support when dim is negative #97916; see the sketch after this list
  • Improve docs for autograd.Function support #98020
  • Fix Exception thrown when running Migration guide example for jacrev #97746
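
A minimal sketch of #97916 (shapes are illustrative assumptions): index_select with a negative dim under torch.func.vmap now matches the unbatched result.

```python
import torch
from torch.func import vmap

x = torch.randn(4, 3, 5)
idx = torch.tensor([0, 2])

batched = vmap(lambda t: t.index_select(-1, idx))(x)  # per-sample, negative dim
reference = x.index_select(-1, idx)                   # unbatched equivalent
print(torch.allclose(batched, reference))  # True
```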

Releng:

  • Fix Convolutions for CUDA-11.8 wheel builds #99451
  • Fix Import torchaudio + torch.compile crashes on exit #96231
  • Linux aarch64 wheels are missing the mkldnn+acl backend support - pytorch/builder@54931c2
  • Linux aarch64 torchtext 0.15.1 wheels are missing for aarch64_linux platform - pytorch/builder#1375
  • Enable ROCm 5.4.2 manywheel and python 3.11 builds #99552
  • PyTorch cannot be installed at the same time as numpy in a conda env on osx-64 / Python 3.11 #97031
  • Illegal instruction (core dumped) on Raspberry Pi 4.0 8gb - pytorch/builder#1370

Torch.optim:

  • Fix fused AdamW causing NaN loss #95847; see the sketch after this list
  • Fix fused AdamW having worse loss than Apex and unfused AdamW for fp16/AMP #98620
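
A minimal sketch of the fused AdamW path touched by #95847 and #98620, assuming a CUDA device (the fused implementation is CUDA-only in 2.0.x):

```python
import torch
import torch.nn as nn

if torch.cuda.is_available():
    model = nn.Linear(8, 8).cuda()
    opt = torch.optim.AdamW(model.parameters(), lr=1e-3, fused=True)

    loss = model(torch.randn(2, 8, device="cuda")).sum()
    loss.backward()
    opt.step()  # NaN-loss and fp16/AMP accuracy issues are fixed in 2.0.1
```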

The release tracker should contain all relevant pull requests related to this release as well as links to related issues.

Commits

Dependabot compatibility score

You can trigger a rebase of this PR by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
> **Note**
> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.

Bumps [torch](https://github.com/pytorch/pytorch) from 2.0.0 to 2.0.1.
- [Release notes](https://github.com/pytorch/pytorch/releases)
- [Changelog](https://github.com/pytorch/pytorch/blob/main/RELEASE.md)
- [Commits](https://github.com/pytorch/pytorch/compare/v2.0.0...v2.0.1)

---
updated-dependencies:
- dependency-name: torch
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] added the dependencies label on May 15, 2023