guidance on using multiple PCs simultaneously towards the same project (poor man's render farm)

gotomgo
Posts: 1
Joined: Mon May 27, 2024 4:41 pm


Post by gotomgo »

I have multiple Windows 10 machines, each with a decent video card. I'd like to have them all working together on the same training session.

ChatGPT recommended a few different options:

  • TensorFlow's Distributed Training

  • PyTorch's Distributed Data Parallel (DDP)

  • Horovod

  • Ray's RaySGD

I was hoping to hear whether anyone else has experience doing this, or whether there is a recommended workflow. Any advice/tips/tricks would be greatly appreciated.
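For context, of the options listed above, PyTorch's DistributedDataParallel is usually the simplest to prototype. The sketch below is purely illustrative and is not Faceswap code (Faceswap has its own training loop): it runs DDP with a single worker so it executes on one CPU-only machine, but the same script launched once per machine (e.g. via torchrun, with a shared master address) is the shape a multi-PC setup would take. The model and function names are made up for the example.

```python
# Illustrative single-worker DDP sketch (not Faceswap code).
# In a real multi-machine run, each machine launches one copy of this
# script with its own rank and a world_size equal to the machine count,
# all pointing MASTER_ADDR at the same "master" PC.
import os
import torch
import torch.distributed as dist
import torch.nn as nn

def run_ddp_step():
    # With world_size=1 this is a degenerate "cluster" of one worker,
    # which lets the sketch run standalone on CPU.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29501")
    dist.init_process_group("gloo", rank=0, world_size=1)  # gloo = CPU-friendly backend

    # Wrapping the model in DDP makes backward() all-reduce gradients
    # across workers, keeping every machine's weights in sync.
    model = nn.parallel.DistributedDataParallel(nn.Linear(8, 2))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)

    x, y = torch.randn(4, 8), torch.randn(4, 2)
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()  # gradient sync happens here in a multi-worker run
    opt.step()

    dist.destroy_process_group()
    return loss.item()
```

The catch, as noted in the reply below this would require wiring into whatever training loop you use, and each worker also needs to see its own shard of the data (typically via a DistributedSampler).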

torzdf
Posts: 2734
Joined: Fri Jul 12, 2019 12:53 am
Answers: 159
Has thanked: 139 times
Been thanked: 639 times

Re: guidance on using multiple PCs simultaneously towards the same project (poor man's render farm)

Post by torzdf »

This is not currently implemented in Faceswap, so you would need to go into the code.

It is something I may look at in the future, though.

My word is final
