[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Dec 20, 2024
1 parent bb458ee commit 8c97ce8
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion transformer_engine/pytorch/attention.py
@@ -604,7 +604,8 @@ def get_attention_backend(
                 use_fused_attention = False
             elif cudnn_version >= (9, 6, 0) and qkv_format == "thd":
                 logger.debug(
-                    "Disabling FusedAttention as it does not support context parallelism with THD for cuDNN 9.6+"
+                    "Disabling FusedAttention as it does not support context parallelism with THD for"
+                    " cuDNN 9.6+"
                 )
                 use_fused_attention = False

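The change above is purely cosmetic: Python implicitly concatenates adjacent string literals, so the two shorter literals produce exactly the same log message as the original long line; only the source line length changes. For context, the guard being patched can be sketched roughly as below. This is a hedged simplification, not Transformer Engine's actual `get_attention_backend` implementation: the helper name `select_fused_attention` and the `context_parallel` flag are assumptions introduced for illustration, and `cudnn_version` is taken to be a `(major, minor, patch)` tuple as the comparison in the diff suggests.

```python
import logging

logger = logging.getLogger("attention_backend")


def select_fused_attention(cudnn_version, qkv_format, context_parallel):
    """Hypothetical simplification of the guard in the diff above.

    Fused attention is ruled out when context parallelism uses the
    THD layout on cuDNN 9.6 or newer. ``cudnn_version`` is a
    ``(major, minor, patch)`` tuple, so tuple comparison gives the
    usual version ordering.
    """
    use_fused_attention = True
    if context_parallel and cudnn_version >= (9, 6, 0) and qkv_format == "thd":
        # Adjacent string literals concatenate into one message,
        # matching the reformatted log call in the commit.
        logger.debug(
            "Disabling FusedAttention as it does not support context parallelism with THD for"
            " cuDNN 9.6+"
        )
        use_fused_attention = False
    return use_fused_attention
```

Tuple comparison is why `(9, 5, 0) >= (9, 6, 0)` is false while `(9, 7, 1) >= (9, 6, 0)` is true, so older cuDNN builds keep the fused path in this sketch.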
