Hey
I have been training at the max batch size of 256 for more than a day.
I haven't had any crashes.
But I read that "there are slight quality degradations at very large batch sizes". So should I reduce it? And if so, to what?
After about 60 hours and 90k iterations my faceswap looks like this:
If I continue, will it improve much more?
I don't see much progress in the last 10 hours of training.
Thanks