
Refactor batch handling update #36596

Open · wants to merge 4 commits into main
Conversation

@yafshar commented Mar 6, 2025

What does this PR do?

This PR refactors the batch handling update, which can help improve performance (a sketch of the idea follows the list below).

  • Added get_num_items_in_batches function to calculate the number of items in each batch outside the training loop.
  • Updated the get_batch_samples function so that it only fetches batch samples inside the loop.
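
A minimal sketch of the intended structure. It reuses the helper names from the list above, but the bodies and signatures here are hypothetical illustrations, not the actual Trainer code:

```python
import torch

def get_num_items_in_batches(batches):
    # Hypothetical helper: count the items (e.g. non-padding labels) of every
    # batch up front, so any device-to-host syncs happen once, before training.
    return [int((batch["labels"] != -100).sum()) for batch in batches]

def get_batch_samples(batch):
    # Hypothetical helper: only fetches the samples; no .item()/sync in the loop.
    return batch["input_ids"]

def training_loop(batches, step_fn):
    num_items = get_num_items_in_batches(batches)  # all syncs happen here, once
    for batch, n_items in zip(batches, num_items):
        samples = get_batch_samples(batch)         # hot loop stays sync-free
        step_fn(samples, n_items)
```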

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.


github-actions bot commented Mar 6, 2025

Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. When it is ready for review, please click the Ready for review button (at the bottom of the PR page).

github-actions bot marked this pull request as draft March 6, 2025 20:55
@yafshar marked this pull request as ready for review March 6, 2025 20:58

@yafshar (Author) commented Mar 6, 2025

@techkang, @SunMarc, @muellerzr, would you please review this? These changes have been added to https://github.com/huggingface/optimum-habana/tree/transformers_4_49, and we get better performance in fine-tuning for some cases.

@techkang (Contributor) commented Mar 7, 2025

Hello, could you explain why this refactoring improves performance? Maybe link the specific PR in optimum-habana?

@regisss (Contributor) commented Mar 7, 2025

> Hello, could you explain why this refactoring improves performance? Maybe link the specific PR in optimum-habana?

I think it's due to calling .item() at each training step, which moves tensors from device to host and synchronizes the device every time, while this could be done once before training.
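
A toy illustration of that cost (hypothetical, assuming a CUDA device is available; the same reasoning applies to other accelerators):

```python
import torch

# Pretend these are the per-batch item counts living on the device.
counts = [torch.randint(1, 512, (1,), device="cuda") for _ in range(1000)]

# Inside the training loop: one device-to-host copy and sync per step.
per_step = [c.item() for c in counts]

# Before training: a single transfer/sync covering all steps.
up_front = torch.cat(counts).tolist()
```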

@SunMarc requested review from muellerzr and SunMarc March 7, 2025 14:22