24 GB, I guess: I run an RTX TITAN (24 GB) and memory is already full at bsz=1; OOM occurs once bsz>=2. By the way, I notice the paper says the authors trained on V100s (32 GB, as is well known), yet the config sets bsz=4. That puzzles me.
Amazing job! I wonder what the minimum GPU VRAM for training is, with an image size of 512 and a batch size of one or two.