Not getting exactly the same embedding for different batch sizes #76
Comments
Any more findings on this yet?
Not from my end.
Most likely something to do with the underlying HF transformers package. It's a lot of finger pointing, but still no resolution at this point, unfortunately.
I'm having the same issue. I tried manipulating other things like the order or content of the batch; the only factor that affects this is the batch size.
Same here. I'm getting a different embedding for different batch_size values. The embeddings start to differ at about the 7th decimal place.
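For what it's worth, this kind of last-digit drift can be reproduced outside any embedding library: float32 matrix multiplies may use different kernels and accumulation orders depending on the batch shape. The following is an illustrative sketch in plain PyTorch (not this project's code), just to show the magnitude of the effect:

```python
import torch

torch.manual_seed(0)
x = torch.randn(8, 512)     # a small "batch" of vectors
w = torch.randn(512, 512)   # an arbitrary projection weight

# Multiply the whole batch at once vs. one row at a time.
full = x @ w
rows = torch.cat([x[i:i + 1] @ w for i in range(x.shape[0])], dim=0)

print(torch.equal(full, rows))                # may be False in float32
print(torch.allclose(full, rows, atol=1e-6))  # True: the differences are tiny
print((full - rows).abs().max())              # typically around 1e-6 or smaller
```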
Hi,
I recently discovered that the model.encode method does not give exactly the same embedding for different batch_size values. The results are still close when I loosen atol (absolute tolerance). Is this expected behaviour, or is it a bug? You may find a minimal code snippet to replicate the conflicting embeddings below:
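The original snippet is not preserved in this thread. As a rough reconstruction (not the author's code), here is a minimal sketch assuming a sentence-transformers-style encode API; the model name is only an example:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Example model; any encoder exposing encode(..., batch_size=...) shows the effect.
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "The first example sentence.",
    "The second example sentence.",
    "The third example sentence.",
    "The fourth example sentence.",
]

# Encode the same inputs with two different batch sizes.
emb_bs1 = model.encode(sentences, batch_size=1)
emb_bs4 = model.encode(sentences, batch_size=4)

# Exact equality fails, but the embeddings agree within a small tolerance.
print(np.array_equal(emb_bs1, emb_bs4))          # typically False
print(np.allclose(emb_bs1, emb_bs4, atol=1e-6))  # typically True
```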
This prints out the following results:
thanks in advance!