layer_norm backward problem #40
Not sure if set …
This error only happens when …
I can reproduce with the following script: …
I found that this issue is more extensive than I initially thought. Other operators, …
The bwd and fwd_bwd tests for layer_norm failed.
The error string is:
RuntimeError: This backward function was compiled with non-empty donated buffers which requires create_graph=False and retain_graph=False. Please keep backward(create_graph=False, retain_graph=False) across all backward() function calls, or set torch._functorch.config.donated_buffer=False to disable donated buffer.
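The error above arises because a compiled backward may "donate" (reuse) the memory of its saved activation buffers, which is only safe if the graph is never walked again; `retain_graph=True` or `create_graph=True` breaks that assumption. Below is a minimal plain-Python sketch of that conflict, with no torch dependency. `DonatedBackward` is a hypothetical stand-in for a compiled backward function, and the `donate` flag mirrors the role of `torch._functorch.config.donated_buffer`; none of these names come from the PyTorch API.

```python
# Conceptual sketch (plain Python, no torch) of why donated buffers
# conflict with retain_graph/create_graph.
class DonatedBackward:
    def __init__(self, saved_buffers, donate=True):
        self.saved = list(saved_buffers)  # activations saved for backward
        self.donate = donate              # stand-in for donated_buffer config

    def __call__(self, retain_graph=False):
        if self.saved is None:
            raise RuntimeError("saved buffers were donated on a prior call")
        grads = [2.0 * b for b in self.saved]  # pretend gradient computation
        if self.donate and not retain_graph:
            # Buffer memory handed back for reuse -> the graph is now unusable.
            self.saved = None
        return grads

bwd = DonatedBackward([1.0, 2.0], donate=True)
print(bwd())                # first backward succeeds
try:
    bwd()                   # second backward fails: buffers were donated
except RuntimeError as e:
    print("error:", e)

# Workaround analogous to torch._functorch.config.donated_buffer = False:
safe = DonatedBackward([1.0, 2.0], donate=False)
print(safe(), safe())       # both calls succeed, at the cost of extra memory
```

This mirrors the two remedies named in the error message: either keep `backward(create_graph=False, retain_graph=False)` everywhere, or disable buffer donation entirely.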
Test Plan: