This repository has been archived by the owner on May 29, 2024. It is now read-only.

build(deps): bump transformers from 4.35.0 to 4.36.2 #142

Closed
wants to merge 1 commit from dependabot/pip/transformers-4.36.2

Conversation

dependabot[bot] (Contributor) commented on behalf of GitHub on Dec 21, 2023

Bumps transformers from 4.35.0 to 4.36.2.

Release notes

Sourced from transformers' releases.

Patch release: v4.36.2

Patch release to resolve some critical issues relating to the recent cache refactor, flash attention refactor, and training in the multi-GPU and multi-node settings:

  • Resolve training bug with PEFT + GC (gradient checkpointing) #28031
  • Resolve cache issue when going beyond the context window for Mistral/Mixtral FA2 #28037
  • Re-enable passing config to from_pretrained with FA #28043 (see the sketch after this list)
  • Fix resuming from checkpoint when using FSDP with FULL_STATE_DICT #27891
  • Resolve bug when saving a checkpoint in the multi-node setting #28078
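
For the "passing config to from_pretrained with FA" item above (#28043), here is a minimal sketch of the pattern that fix re-enables, assuming the flash-attn package and a compatible GPU are available; the Mixtral checkpoint name is borrowed from the example further down purely for illustration:

>>> import torch
>>> from transformers import AutoConfig, AutoModelForCausalLM
>>> # Load the config explicitly, then hand it back to from_pretrained together
>>> # with the Flash Attention 2 implementation (the combination #28043 restores).
>>> config = AutoConfig.from_pretrained("mistralai/Mixtral-8x7B")
>>> model = AutoModelForCausalLM.from_pretrained(
...     "mistralai/Mixtral-8x7B",
...     config=config,
...     attn_implementation="flash_attention_2",
...     torch_dtype=torch.float16,
...     device_map="auto",
... )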

Patch release: v4.36.1

A patch release, mostly for critical torch issues:

  • Fix SDPA correctness following torch==2.1.2 regression #27973 (see the sketch after this list)
  • [Tokenizer Serialization] Fix the broken serialisation #27099
  • Fix bug with rotating checkpoints #28009
  • Hotfix for the Mixtral loss #27948 🔥

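The SDPA item above refers to the torch.nn.functional.scaled_dot_product_attention backend that this release series enables broadly; a minimal sketch of requesting it explicitly (the checkpoint name is reused from the Mixtral example below only for illustration):

>>> import torch
>>> from transformers import AutoModelForCausalLM
>>> # Explicitly request the PyTorch SDPA attention backend; "eager" and
>>> # "flash_attention_2" are the other accepted values for attn_implementation.
>>> model = AutoModelForCausalLM.from_pretrained(
...     "mistralai/Mixtral-8x7B",
...     attn_implementation="sdpa",
...     torch_dtype=torch.float16,
...     device_map="auto",
... )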

v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2, AMD ROCm, F.sdpa widespread support

New model additions

Mixtral

Mixtral is the new open-source model from Mistral AI, announced in the blog post Mixtral of Experts. According to the benchmark results shared in the release blog post, the model has been shown to have capabilities comparable to ChatGPT.

The architecture is a sparse Mixture of Experts with a Top-2 routing strategy, similar to the NllbMoe architecture in transformers. You can use it through the AutoModelForCausalLM interface:

>>> import torch
>>> from transformers import AutoModelForCausalLM, AutoTokenizer
>>> # device_map="auto" places the model on the available GPU(s), so no explicit .to(device) is needed.
>>> model = AutoModelForCausalLM.from_pretrained("mistralai/Mixtral-8x7B", torch_dtype=torch.float16, device_map="auto")
>>> tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B")
>>> prompt = "My favourite condiment is"
>>> # Move the tokenised inputs to the same device as the model before generating.
>>> model_inputs = tokenizer([prompt], return_tensors="pt").to(model.device)
>>> generated_ids = model.generate(**model_inputs, max_new_tokens=100, do_sample=True)
>>> tokenizer.batch_decode(generated_ids)[0]

The model is compatible with existing optimisation tools such as Flash Attention 2, bitsandbytes and the PEFT library. The checkpoints are released under the mistralai organisation on the Hugging Face Hub.
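
Since the note mentions bitsandbytes compatibility, here is a minimal sketch (not taken from the release notes) of loading the same checkpoint in 4-bit, assuming the bitsandbytes package is installed:

>>> import torch
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
>>> # Quantise the weights to 4-bit so the 8x7B checkpoint fits on smaller GPUs.
>>> quant_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)
>>> model = AutoModelForCausalLM.from_pretrained(
...     "mistralai/Mixtral-8x7B",
...     quantization_config=quant_config,
...     device_map="auto",
... )
>>> tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B")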

Llava / BakLlava

... (truncated)

Commits
  • a7cab3c Release: v4.36.2
  • f6d6189 Fix bug for checkpoint saving on multi node training setting (#28078)
  • 64bcf77 fix resuming from ckpt when using FSDP with FULL_STATE_DICT (#27891)
  • 780376f [Modeling / Mixtral] Fix GC + PEFT issues with Mixtral (#28061)
  • 6e4429f [FA-2] Fix fa-2 issue when passing config to from_pretrained (#28043)
  • f33b061 Generate: Mistral/Mixtral FA2 cache fix when going beyond the context window ...
  • d1dec79 [core / modeling] Fix training bug with PEFT + GC (#28031)
  • c48787f fix seamless import
  • bd65410 Release: v4.36.1
  • 6342b9b Fix bug with rotating checkpoints (#28009)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

Dependabot will merge this PR once CI passes on it, as requested by @bot-anik.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [transformers](https://github.com/huggingface/transformers) from 4.35.0 to 4.36.2.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](huggingface/transformers@v4.35.0...v4.36.2)

---
updated-dependencies:
- dependency-name: transformers
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Dec 21, 2023
dependabot[bot] requested review from amimart and ccamel on December 21, 2023 at 15:08
coderabbitai bot commented on Dec 21, 2023

Important

Auto Review Skipped

Bot user detected.

To trigger a single review, invoke the @coderabbitai review command.

Thank you for using CodeRabbit. We offer it for free to the OSS community and would appreciate your support in helping us grow. If you find it useful, would you consider giving us a shout-out on X?


Tips

Chat with CodeRabbit Bot (@coderabbitai)

  • You can reply to a review comment made by CodeRabbit.
  • You can tag CodeRabbit on specific lines of code or files in the PR by tagging @coderabbitai in a comment.
  • You can tag @coderabbitai in a PR comment and ask one-off questions about the PR and the codebase. Use quoted replies to pass the context for follow-up questions.

CodeRabbit Commands (invoked as PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger a review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai resolve to resolve all the CodeRabbit review comments.
  • @coderabbitai help to get help.

Additionally, you can add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • The JSON schema for the configuration file is available here.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/coderabbit-overrides.v2.json

bot-anik (Member) left a comment

@dependabot merge

dependabot[bot] (Contributor, Author) commented on behalf of GitHub on Dec 21, 2023

One of your CI runs failed on this pull request, so Dependabot won't merge it.

Dependabot will still automatically merge this pull request if you amend it and your tests pass.

dependabot[bot] (Contributor, Author) commented on behalf of GitHub on Jan 22, 2024

Superseded by #150.

dependabot[bot] closed this on Jan 22, 2024
dependabot[bot] deleted the dependabot/pip/transformers-4.36.2 branch on January 22, 2024 at 15:46
Labels: dependencies, python