Remove SP check for rng tracker
Signed-off-by: Kirthi Shankar Sivamani <[email protected]>
ksivaman authored Oct 18, 2024
1 parent 41fe1e5 commit 3bc96e1
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion transformer_engine/pytorch/attention.py
@@ -7446,7 +7446,7 @@ def __init__(
         ), "The number of attention heads must be divisible by the number of GQA groups!"

         self.rng_states_tracker = None
-        if sequence_parallel or get_rng_state_tracker is None:
+        if get_rng_state_tracker is None:
             attention_dropout_ctx = nullcontext
         else:
             self.rng_states_tracker = get_rng_state_tracker()
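For context, a minimal standalone sketch of the selection logic after this change, not the library's actual __init__: with no RNG state tracker getter supplied, attention dropout falls back to nullcontext; otherwise the tracker is used even when sequence_parallel is enabled. The helper name is hypothetical, and the fork-based context comes from a line elided in the diff above, so it is an assumption here.

from contextlib import nullcontext

def _select_attention_dropout_ctx(get_rng_state_tracker=None):
    # Hypothetical standalone helper illustrating the post-commit behavior.
    rng_states_tracker = None
    if get_rng_state_tracker is None:
        # No tracker supplied: run attention dropout under a plain nullcontext.
        attention_dropout_ctx = nullcontext
    else:
        # Tracker supplied: use it even under sequence parallelism.
        rng_states_tracker = get_rng_state_tracker()
        # The diff is truncated after this point; forking the tracker's RNG
        # state around dropout is an assumption about the elided line.
        attention_dropout_ctx = rng_states_tracker.fork
    return rng_states_tracker, attention_dropout_ctx

Called with get_rng_state_tracker=None this returns (None, nullcontext); before this commit, passing sequence_parallel=True would also have forced the nullcontext fallback even when a tracker was available.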
