I have tried a GTX 1080 8GB, and it's too little. Will two cards totalling 16GB be enough?
Please give me a reply. I am going to buy a new graphics card now, but I am not sure whether 16GB is enough. 8GB is clearly not enough for what I want to do.
I don't know how much it needs. I do know that it trains fine on an RTX 2080 Ti, and people on our Discord report that it trains on 8GB under Linux.
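On your two-card question: as far as I know, Faceswap's distributed option is built on TensorFlow's MirroredStrategy, and under that scheme each GPU holds its own complete copy of the model. Two 8GB cards let you raise the batch size and throughput, but they do not behave like a single 16GB card for fitting a bigger model. A minimal sketch (with a stand-in model, not Faceswap's actual code):

import tensorflow as tf

# Data-parallel training: every replica (GPU) holds a complete copy of the
# weights, so VRAM is not pooled across cards. The global batch is split
# across the replicas instead.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Stand-in model; any Keras model built inside the scope is mirrored.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(256, 256, 3)),
        tf.keras.layers.Conv2D(64, 5, strides=2, padding="same", activation="relu"),
        tf.keras.layers.Conv2DTranspose(3, 5, strides=2, padding="same"),
    ])
    model.compile(optimizer="adam", loss="mae")

# With two GPUs, a global batch of 8 becomes 4 per card;
# model.fit(...) handles the split automatically.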
Thanks for the reply.
Is it possible it needs 11GB? Would a GTX 1080 Ti 11GB be enough, then? I upgraded from a 1560 4GB to a 1080 8GB, but I am disappointed: the limitations are really very much the same. These cards are so expensive that I don't want to make the same mistake again.
Honestly, your questions are hard to answer... you will always "need" more VRAM.
However, I have just tested training Dfaker at 256px on Windows 10 on a GTX 1080 (8GB). I can confirm that it trains just fine with Mixed Precision enabled at a batch size of 4 on this card.
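For anyone wondering why Mixed Precision helps: it runs most compute in float16 while keeping the variables in float32, which roughly halves activation memory. A minimal TensorFlow/Keras sketch (again a stand-in model, not Dfaker itself):

import tensorflow as tf

# Compute in float16, variables kept in float32 for numerical stability.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(256, 256, 3)),
    tf.keras.layers.Conv2D(128, 5, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(256, 5, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(128, 5, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(3, 5, strides=2, padding="same"),
    # Force the output back to float32 so the loss is computed stably.
    tf.keras.layers.Activation("linear", dtype="float32"),
])
model.compile(optimizer="adam", loss="mae")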
Thanks, that's just the answer I wanted. Then I just need to reduce the batch size instead of buying a new card. Excellent!
Have you tried a batch size higher than 4 with Dfaker at 256px on the GTX 1080 8GB, or is that the limit? I just started with batch size 4 and it runs nicely. Thanks again.
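If anyone reading this later wants to find their card's limit without guessing, one rough approach is to halve the batch size on out-of-memory errors until a run succeeds. A sketch of that idea (the model and data are stand-ins, not Faceswap internals):

import numpy as np
import tensorflow as tf

def build_model():
    # Stand-in autoencoder-style model at 256px; any Keras model works here.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(256, 256, 3)),
        tf.keras.layers.Conv2D(128, 5, strides=2, padding="same", activation="relu"),
        tf.keras.layers.Conv2D(256, 5, strides=2, padding="same", activation="relu"),
        tf.keras.layers.Conv2DTranspose(128, 5, strides=2, padding="same", activation="relu"),
        tf.keras.layers.Conv2DTranspose(3, 5, strides=2, padding="same"),
    ])
    model.compile(optimizer="adam", loss="mae")
    return model

def largest_batch_size(start=32):
    batch_size = start
    while batch_size >= 1:
        try:
            x = np.random.rand(batch_size, 256, 256, 3).astype("float32")
            build_model().fit(x, x, batch_size=batch_size, epochs=1, verbose=0)
            return batch_size                  # trained without running out of VRAM
        except tf.errors.ResourceExhaustedError:
            tf.keras.backend.clear_session()   # release graph memory before retrying
            batch_size //= 2                   # halve and try again
    return None

print("Largest batch size that fits:", largest_batch_size())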