Deterministic DataLoader on DDP reads same data on all subprocesses #6350
-
I've used … What is the way to fix this? Shouldn't the DistributedSampler for DDP automatically get a seed based on the subprocess it is forked on? (Similar to DataLoader's …)
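For what it's worth, DistributedSampler deliberately uses the *same* seed on every rank: all ranks compute an identical shuffled order (from a shared seed plus the epoch passed via `set_epoch`), and each rank then keeps a rank-strided slice, which is what makes the shards disjoint. A stdlib sketch of that idea (names are illustrative; the real sampler also pads indices so every rank gets an equal-length shard):

```python
import random

def shard_indices(dataset_len, world_size, rank, seed, epoch):
    """Mimic DistributedSampler: every rank shuffles with the SAME
    seed (seed + epoch), then keeps a rank-strided slice of it."""
    rng = random.Random(seed + epoch)   # identical RNG state on every rank
    order = list(range(dataset_len))
    rng.shuffle(order)                  # same permutation on every rank
    return order[rank::world_size]      # disjoint shard per rank

shards = [shard_indices(10, world_size=2, rank=r, seed=42, epoch=0)
          for r in (0, 1)]
# The shards are disjoint and together cover the whole dataset.
assert not set(shards[0]) & set(shards[1])
assert set(shards[0]) | set(shards[1]) == set(range(10))
```

So a per-subprocess seed is not what distinguishes the ranks; the rank offset is. If every rank still reads identical data, the sampler is typically not being told its rank (or a plain sequential sampler is being used instead).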
Replies: 1 comment
-
Update: This seemed to be the issue when only setting seeds and configuring PyTorch and CUDA for deterministic execution, but adding `deterministic=True` to PL's Trainer object seems to have resolved the issue.
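As a side note on why seeding alone can produce the "same data on all subprocesses" symptom: if every process seeds its RNG with the same global value and then draws samples independently (instead of sharding one shared permutation), each rank reproduces an identical stream. A stdlib sketch of the failure mode versus a per-process offset, which is roughly the pattern DataLoader applies per worker (`BASE_SEED` and both function names are illustrative, not a real API):

```python
import random

BASE_SEED = 42  # illustrative global seed shared by all processes

def naive_stream(rank, n=8):
    # rank is ignored on purpose: every process seeds identically and
    # samples independently, so all ranks draw the SAME indices.
    rng = random.Random(BASE_SEED)
    return [rng.randrange(100) for _ in range(n)]

def per_rank_stream(rank, n=8):
    # Offsetting the seed by rank gives each process its own stream.
    rng = random.Random(BASE_SEED + rank)
    return [rng.randrange(100) for _ in range(n)]

assert naive_stream(0) == naive_stream(1)        # duplicated data
assert per_rank_stream(0) != per_rank_stream(1)  # distinct per rank
```

This is orthogonal to the Trainer's `deterministic=True` flag, which governs whether PyTorch restricts itself to deterministic kernels; correct per-rank data sharding still has to come from the sampler.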