by misaka17009
Fri Feb 18, 2022 6:32 pm
Forum: Training Support
Topic: Do I need to adjust the batch size for distributed training?

Do I need to adjust the batch size for distributed training?

If I have 4 GPUs, do I need to divide batch_size by 4 to get the same result as with 1 GPU? I noticed that the memory allocated to each GPU in distributed mode is the same as in single-GPU training, so I assume the effective batch size in 4-GPU distributed mode is four times the single-GPU one. ...
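For what it's worth, here is a minimal sketch of that behaviour, assuming PyTorch's DistributedDataParallel launched with torchrun (the post does not name a framework, so PyTorch and every identifier below are illustrative assumptions, not the poster's actual setup). The batch_size given to each process's DataLoader is per GPU; DDP averages gradients across ranks, so one optimizer step effectively consumes batch_size × world_size samples. Dividing batch_size by the number of GPUs therefore recovers the single-GPU effective batch.

Code:

    import os
    import torch
    import torch.distributed as dist
    import torch.nn.functional as F
    from torch.nn.parallel import DistributedDataParallel as DDP
    from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

    def main():
        # torchrun sets RANK, LOCAL_RANK and WORLD_SIZE for every process.
        dist.init_process_group(backend="nccl")
        local_rank = int(os.environ["LOCAL_RANK"])
        world_size = dist.get_world_size()
        torch.cuda.set_device(local_rank)

        per_gpu_batch = 64                            # batch_size seen by each GPU
        effective_batch = per_gpu_batch * world_size  # samples per optimizer step

        dataset = TensorDataset(torch.randn(4096, 10), torch.randn(4096, 1))
        # DistributedSampler hands each rank a disjoint shard, so every step
        # draws per_gpu_batch fresh samples on each GPU.
        sampler = DistributedSampler(dataset)
        loader = DataLoader(dataset, batch_size=per_gpu_batch, sampler=sampler)

        model = DDP(torch.nn.Linear(10, 1).cuda(local_rank),
                    device_ids=[local_rank])
        opt = torch.optim.SGD(model.parameters(), lr=0.1)

        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            loss = F.mse_loss(model(x), y)
            opt.zero_grad()
            loss.backward()  # gradients are averaged across all ranks here
            opt.step()       # one step consumes effective_batch samples in total

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()

Launched with, e.g., torchrun --nproc_per_node=4 train.py, per_gpu_batch = 64 gives an effective batch of 256; setting per_gpu_batch = 64 // world_size (16 per GPU) reproduces the single-GPU effective batch of 64. This also matches the memory observation in the question: each GPU holds its own per_gpu_batch, so per-GPU memory use looks the same as in single-GPU training even though the effective batch has grown.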