
Fix compute_loss signature compatibility with transformers >= 4.46 #155

Closed
ma-kjh wants to merge 11 commits into locuslab:main from ma-kjh:ma-kjh-patch-2

Conversation

@ma-kjh (Contributor) commented Nov 25, 2025

What does this PR do?

Problem

Recent versions of transformers (v4.46+) pass a new keyword argument, num_items_in_batch, to the compute_loss method.
The custom trainers (GradAscent, etc.) override compute_loss without accepting arbitrary keyword arguments, so running with a recent transformers version raises a TypeError.

Error log:
TypeError: GradAscent.compute_loss() got an unexpected keyword argument 'num_items_in_batch'
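
For context, the failing override matches the pre-4.46 Trainer signature (a sketch; the actual method bodies live in this repo's custom trainers):

```python
from transformers import Trainer

class GradAscent(Trainer):
    # Pre-4.46 style override: a fixed parameter list with no **kwargs.
    def compute_loss(self, model, inputs, return_outputs=False):
        ...

# transformers >= 4.46 calls the method with an extra keyword argument,
# roughly: self.compute_loss(model, inputs, num_items_in_batch=n),
# which the signature above cannot absorb, hence the TypeError.
```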

Solution

Added **kwargs to the compute_loss signature of the custom trainers so they absorb num_items_in_batch (and any future keyword arguments) and remain compatible with recent transformers releases.
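
A minimal sketch of the fix, assuming the trainers subclass transformers.Trainer; the negated-loss body shown for GradAscent is illustrative, not the repo's exact implementation:

```python
from transformers import Trainer

class GradAscent(Trainer):
    # Accept **kwargs so transformers >= 4.46 can pass num_items_in_batch
    # (and any future keyword arguments) without raising a TypeError.
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        outputs = model(**inputs)
        # Illustrative gradient-ascent objective: negate the loss so the
        # optimizer's descent step maximizes it.
        loss = -outputs.loss
        return (loss, outputs) if return_outputs else loss
```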

Fixes # (issue)
N/A

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Have you gone through the contributions guide?
  • Are your changes documented? Read documentation guidelines here.

@filyp commented Jan 24, 2026

I'm also going to need a more recent version of transformers, so I'm wondering why you closed this PR. Was there something wrong with the fix?

