Possible to turn Fc off?
Is it possible to have only the Gblock as an Fc and turn the regular Fc off?
I noticed that if Shared Fc is set to None, a Split Fc is created by default.
Probably not... I can't remember exactly how I built it; I would need to look at the code to see whether it is possible. For now, if you are asking whether you can disable the fully-connected layers entirely, then I would say no.
However, you may be able to simulate disabling the fully-connected layers by setting them to a depth of "0". Again, I could be wrong; I haven't tested it.
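Something like the following is roughly what that amounts to. The option names here just mirror the GUI labels used in this thread, so they may not match the actual Phaze-A config keys exactly:
Code:
# Illustrative only: keys mirror the GUI labels discussed in this thread,
# not necessarily the exact Phaze-A configuration names.
settings_to_try = {
    "shared_fc": "none",    # no shared fully-connected block between A and B
    "split_fc": True,       # per the question above, split fc_a/fc_b blocks
                            # appear by default when Shared Fc is None
    "fc_depth": 0,          # the suggested "depth of 0"
    "enable_gblock": True,  # keep the G-Block (and its own fc_gblock)
}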
The settings are a bit confusing, to be honest.
If I set Shared Fc to None but leave Split Fc unchecked, do I have a Shared Fc or not?
OK, I just did a quick test generating a model. I haven't tried training it, but it works...
I don't know your model setup, so I just did this based on the default preset.
This will still create the Fully Connected layer, but all it will do is reshape the bottleneck output into something the decoder can take (i.e. in the example above, it will turn the output from 1024 to 1x1x1024 so that the decoder can accept it as an input).
Like I say, not tested training, just that it will generate.
Model summary:
Code:
____________________________________________________________________________________________________
Model: "phaze_a"
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
face_in_a (InputLayer) [(None, 64, 64, 3)] 0 []
face_in_b (InputLayer) [(None, 64, 64, 3)] 0 []
encoder (Functional) (None, 1024) 33992960 ['face_in_a[0][0]',
'face_in_b[0][0]']
fc_a (Functional) (None, 1, 1, 1024) 0 ['encoder[0][0]']
fc_gblock (Functional) (None, 512) 1050112 ['encoder[0][0]',
'encoder[1][0]']
fc_b (Functional) (None, 1, 1, 1024) 0 ['encoder[1][0]']
g_block_both (Functional) (None, 1, 1, 1024) 23864832 ['fc_a[0][0]',
'fc_gblock[0][0]',
'fc_b[0][0]',
'fc_gblock[1][0]']
decoder_both (Functional) (None, 128, 128, 3) 40944739 ['g_block_both[0][0]',
'g_block_both[1][0]']
====================================================================================================
Total params: 99,852,643
Trainable params: 99,852,643
Non-trainable params: 0
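To make the summary easier to read, here is a rough tensorflow.keras sketch of the data flow it describes. The sub-model bodies below are trivial stand-ins (the real encoder, G-Block and decoder are obviously far bigger); only the wiring and the input/output shapes follow the summary:
Code:
from tensorflow.keras import layers, Model


def sub_model(in_shapes, build, name):
    """Wrap a few stand-in layers as a named Functional sub-model."""
    inputs = [layers.Input(shape=shape) for shape in in_shapes]
    outputs = build(inputs if len(inputs) > 1 else inputs[0])
    return Model(inputs, outputs, name=name)


# Stand-in sub-models: shapes match the summary, internals do not.
encoder = sub_model([(64, 64, 3)],
                    lambda x: layers.Dense(1024)(layers.Flatten()(x)), "encoder")
fc_a = sub_model([(1024,)], lambda x: layers.Reshape((1, 1, 1024))(x), "fc_a")
fc_b = sub_model([(1024,)], lambda x: layers.Reshape((1, 1, 1024))(x), "fc_b")
fc_gblock = sub_model([(1024,)], lambda x: layers.Dense(512)(x), "fc_gblock")
g_block = sub_model(
    [(1, 1, 1024), (512,)],
    lambda x: layers.Conv2D(1024, 1)(
        layers.Concatenate()([x[0], layers.Reshape((1, 1, 512))(x[1])])),
    "g_block_both")
decoder = sub_model(
    [(1, 1, 1024)],
    lambda x: layers.Conv2D(3, 3, padding="same")(layers.UpSampling2D(128)(x)),
    "decoder_both")

# The wiring from the summary: the encoder, g-block and decoder are shared,
# fc_a/fc_b are per side, and fc_gblock feeds the g-block for both sides.
face_in_a = layers.Input(shape=(64, 64, 3), name="face_in_a")
face_in_b = layers.Input(shape=(64, 64, 3), name="face_in_b")
enc_a, enc_b = encoder(face_in_a), encoder(face_in_b)

out_a = decoder(g_block([fc_a(enc_a), fc_gblock(enc_a)]))  # (None, 128, 128, 3)
out_b = decoder(g_block([fc_b(enc_b), fc_gblock(enc_b)]))

model = Model([face_in_a, face_in_b], [out_a, out_b], name="phaze_a_sketch")
model.summary()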
FC Layers:
Code:
____________________________________________________________________________________________________
Model: "fc_b"
____________________________________________________________________________________________________
Layer (type) Output Shape Param #
====================================================================================================
input_3 (InputLayer) [(None, 1024)] 0
reshape_1 (Reshape) (None, 1, 1, 1024) 0
leaky_re_lu_1 (LeakyReLU) (None, 1, 1, 1024) 0
====================================================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
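For reference, that zero-parameter block boils down to little more than this (a sketch, not the actual Phaze-A code; the layer names are illustrative):
Code:
from tensorflow.keras import layers, Model


def fc_passthrough(bottleneck_size=1024):
    """Reshape the flat bottleneck to a 1x1 feature map; no trainable weights."""
    inputs = layers.Input(shape=(bottleneck_size,))
    var_x = layers.Reshape((1, 1, bottleneck_size))(inputs)
    var_x = layers.LeakyReLU()(var_x)  # slope left at the default; illustrative
    return Model(inputs, var_x, name="fc_b")

fc_passthrough().summary()  # Total params: 0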