Want to understand the training process better? Got tips for which model to use and when? This is the place for you.
Forum rules
Read the FAQs and search the forum before posting a new topic.
This forum is for discussing tips and understanding the process involved with Training a Faceswap model.
If you have found a bug or are having issues with the Training process not working, then you should post in the Training Support forum.
Please mark any answers that fixed your problems so others can find the solutions.
park
Posts: 2 Joined: Tue Jun 02, 2020 2:56 pm
Answers: 0
by park » Tue Jun 02, 2020 3:13 pm
Hello,
I'm a student studying faceswap.
Through the [Tip] Training best practices post, I learned about pretraining a model.
I want to know more about pretraining, but nobody seems to use it, so there is no guide.
So if you don't mind, can you tell me how to use pretraining?
torzdf
Posts: 2796 Joined: Fri Jul 12, 2019 12:53 am
Answers: 160
Has thanked: 142 times
Been thanked: 650 times
by torzdf » Wed Jun 03, 2020 9:21 am
I'm personally of the opinion that pre-training offers little to no benefit.
The basic concept is to start training a model on lots of random faces so it learns what a face is, prior to moving over to your actual training set.
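To make the concept concrete, here is a minimal sketch of the idea in Keras. This is not Faceswap's actual code: the toy autoencoder, the `generic_faces` and `face_a` arrays, and all shapes are illustrative assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_autoencoder(size=64):
    """Toy autoencoder standing in for a faceswap-style model."""
    inp = keras.Input(shape=(size, size, 3))
    x = layers.Conv2D(32, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    model = keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mae")
    return model

model = build_autoencoder()

# Phase 1 (pretraining): many random identities, so the model learns
# generic face structure. `generic_faces` is a placeholder array here.
generic_faces = np.random.rand(256, 64, 64, 3).astype("float32")
model.fit(generic_faces, generic_faces, epochs=1, batch_size=16)

# Phase 2: continue training the SAME weights on the real training set.
face_a = np.random.rand(128, 64, 64, 3).astype("float32")
model.fit(face_a, face_a, epochs=1, batch_size=16)
```

The key point is that the weights from phase 1 are carried into phase 2 unchanged; pretraining only changes the starting point, not the architecture.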
park
Posts: 2 Joined: Tue Jun 02, 2020 2:56 pm
Answers: 0
by park » Wed Jun 03, 2020 9:30 am
I understand from your post that this method is not good.
But I just want to know about the various methods for pretraining.
torzdf
Posts: 2796 Joined: Fri Jul 12, 2019 12:53 am
Answers: 160
Has thanked: 142 times
Been thanked: 650 times
by torzdf » Wed Jun 03, 2020 9:45 am
Unfortunately, I'm not the person to ask. Hopefully [mention]bryanlyon[/mention], or someone who uses it, will be along to offer some tips.
bryanlyon
Site Admin
Posts: 805 Joined: Fri Jul 12, 2019 12:49 am
Answers: 44
Location: San Francisco
Has thanked: 4 times
Been thanked: 224 times
by bryanlyon » Wed Jun 03, 2020 6:46 pm
Pretraining is where you train the model on other faces before you train it on your original and swap faces. That's all pretraining is. However, we do not recommend it, since it damages the model's ability to focus on a single identity swap.
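In practice that just means two training runs sharing the same model folder, so the weights from the first run carry into the second. Here is a hedged sketch of that workflow, driving the Faceswap CLI from Python; the face folders are placeholders for directories of extracted faces, and each run is stopped manually when you are ready to move on.

```python
import subprocess

MODEL_DIR = "model_dir"  # same folder for both phases, so weights carry over

# Phase 1 (pretraining): random identities on both sides.
# Stop this run manually once the previews look face-like.
subprocess.run([
    "python", "faceswap.py", "train",
    "-A", "random_faces", "-B", "random_faces",
    "-m", MODEL_DIR, "-t", "original",
])

# Phase 2: point the SAME model at the real original (A) and
# swap (B) face sets and keep training from where phase 1 left off.
subprocess.run([
    "python", "faceswap.py", "train",
    "-A", "faces_a", "-B", "faces_b",
    "-m", MODEL_DIR, "-t", "original",
])
```

Because the model never resets between runs, everything it learned from the random faces is still in the weights when phase 2 starts, which is exactly why it can struggle to narrow down to a single identity swap.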