
Can somebody help me in using Efficientnet in Disney(DNY) models?

Posted: Mon Sep 05, 2022 7:05 am
by vichitra5587

I don't understand deep-learning fundamentals, so whenever I open the Phaze-A model I am overwhelmed by the options I see.

I have read that EfficientNet v1/v2 are great encoders, with high accuracy, faster training & lower model size.
Has anybody here used EfficientNet v1/v2 with the Disney (DNY) models?
If so, can you share your settings? When I only change the trainer of the Disney (DNY) 256 model to EfficientNetV2_S,
my model shows only colorful blocks & no faces.

@torzdf @ianstephens, any suggestions for using EfficientNet v1/v2 with the Disney (DNY) 256 model?

Please help, as I already love the Disney (DNY) 256 model for its low GPU/VRAM usage & incredible results.


Re: Can somebody help me in using Efficientnet in Disney(DNY) models?

Posted: Tue Sep 06, 2022 10:07 am
by torzdf

Not tested, but you can try:

Encoder           Enc Scaling
EfficientNetV2_S  67
EfficientNetB4    67

If neither works, delete the model folder, lower the learning rate to 1e-3.5 and try again.

(You can test other EfficientNet encoders. To get the encoder scaling, divide 256 (the DNY input size) by the encoder's output size, multiply by 100 and round up.

e.g. EffNetV2_S has an output size of 384, so:
256 / 384 = 0.6667
0.6667 * 100 = 66.67
Rounded up = 67%)
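The calculation above can be sketched in a few lines of Python (the function name is mine, not part of faceswap; the sizes are the ones quoted in this post):

```python
import math

def encoder_scaling(model_input_size: int, encoder_output_size: int) -> int:
    """Percentage for the Phaze-A "Enc Scaling" option.

    Divide the model's input size by the encoder's native output size,
    convert to a percentage, and round up.
    """
    return math.ceil(model_input_size / encoder_output_size * 100)

# EfficientNetV2_S outputs 384px; DNY256 expects 256px input:
print(encoder_scaling(256, 384))  # -> 67
```

Rounding up rather than to the nearest integer means the scaled encoder output is never smaller than the model input.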


Re: Can somebody help me in using Efficientnet in Disney(DNY) models?

Posted: Wed Sep 07, 2022 10:45 am
by vichitra5587

Thanks a million for explaining the math.
I am sure this post will help many people who want to experiment with the EfficientNet encoders.

I applied your EfficientNetV2_S scaling settings to the DNY 256 model & it worked straight away.
I always train all my models at a learning rate of 3e, so it worked fine with this scaling setting.

From the initial 6k iterations, I think the default DNY 256 model is winning over EfficientNetV2_S.
I say this based on iteration speed, the speed at which it learns facial detail, GPU usage, temperature & power consumption.
Disney 256's default settings iterate faster & learn facial details faster, with lower GPU usage & power consumption.

However, 6k iterations are not enough to draw a conclusion, so I will let the DNY 256 model train with both the default & EfficientNetV2_S settings
for at least 500k-600k iterations & then post the results here with some high-resolution face-swap examples.


Re: Can somebody help me in using Efficientnet in Disney(DNY) models?

Posted: Wed Oct 05, 2022 11:59 pm
by MaxHunter

I want to piggy-back off this post:
@torzdf
What are the ramifications of using DNY512 with EfficientNetV2, even though higher resolutions aren't supported? Will it just be slower because of the higher resolution? It's been spotty when I've tried, but maybe I'm just not experienced enough to know how to make it work.