
Freezing B Weights

Posted: Mon Feb 27, 2023 10:11 pm
by MaxHunter

So, I've read the few posts about reusing B-side weights. I reuse the B side all the time, but I've never tried to load and then freeze the B side; I've only used the general freeze-weights option box.

Would it be worth placing a "B-side only" box in the training side of Faceswap for these cases? It seems to me it comes up a lot.

My other question is for clarification:

To reuse my B-side weights and continue their training in Phaze-A: if I have No Shared FC, Enable G-Block, Split FC, and Split Decoders enabled, I would:

Under Weights > Load Layers, tick off FC B, G-Block B, and Decoder B.

Right?

Question 3:

Since I'm reusing B weights, once my B weights are at the perfect level (because I would have been reusing and training the B side over multiple models and instances), I then freeze the B side while training new A-side models. Correct?

For instance:
Using Steve Buscemi as a B side for three different instances, I finally have him down to an acceptable level (we'll say a loss of .01), but now I want to use my .01 Steve Buscemi on a new model of Princess Leia. This is where I would freeze everything pertaining to the B side for the entire training process. Correct?


Re: Freezing B Weights

Posted: Tue Feb 28, 2023 12:11 pm
by torzdf
MaxHunter wrote: Mon Feb 27, 2023 10:11 pm

Would it be worth placing a "B-side only" box in the training side of Faceswap for these cases? It seems to me it comes up a lot.

There is a "B" side only option. Or do you mean an option to auto-select layers that only exist for the B side, so you don't have to select individual layers? If so, sure, it's possible, but I'm unlikely to implement it, as it's a fair amount of code just to save the end user from clicking 2 extra checkboxes.

Under Weights > Load Layers, tick off FC B, G-Block B, and Decoder B.

Right?

I assume you mean "check the boxes"; if so, yes.
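For reference, the same selection can also be made outside the GUI by editing the training configuration file. The sketch below is an assumption, not a verbatim excerpt: the section name and option keys are taken from a typical Phaze-A `train.ini` and may differ between Faceswap versions, so check your own settings file for the exact names.

```ini
# Hypothetical excerpt from Faceswap's config/train.ini (Phaze-A section).
# Key names are assumptions; verify against your own train.ini or the GUI.
[model.phaze_a]
# B-side-only layers whose weights are loaded from the existing model:
load_layers = fc_b g_block_b decoder_b
```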

Since I'm reusing B weights, once my B weights are at the perfect level (because I would have been reusing and training the B side over multiple models and instances), I then freeze the B side while training new A-side models. Correct?

For instance:
Using Steve Buscemi as a B side for three different instances, I finally have him down to an acceptable level (we'll say a loss of .01), but now I want to use my .01 Steve Buscemi on a new model of Princess Leia. This is where I would freeze everything pertaining to the B side for the entire training process. Correct?

Yes, sounds fine.
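To spell the workflow out: when you start the new Leia model, you load the trained Buscemi weights into the B-side layers and freeze those same layers, so only the A side updates. As a config sketch (again, the section and key names below are assumptions based on a typical Phaze-A `train.ini` and may differ in your version):

```ini
# Hypothetical excerpt; check your own train.ini / GUI settings for exact keys.
[model.phaze_a]
# Load the trained Steve Buscemi weights into the B-side layers only:
load_layers = fc_b g_block_b decoder_b
# Freeze those same layers so only the A side (Leia) trains:
freeze_layers = fc_b g_block_b decoder_b
```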