LPIPS loss function


LPIPS loss function

Post by rbanfield82 »

I was using a hybrid of your and Icarus' Phaze-A setups/suggestions that I found in the Phaze-A post. I used Torzdf's loss function setup:

SSIM - main loss function
MAE - 25% - secondary L1 regularisation term
LPIPS-Alex - 50%
FFL - 100%
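
Roughly speaking, that setup amounts to a weighted sum of the individual terms. A minimal Keras/TF-style sketch of the idea, with only the SSIM and MAE terms written out (illustrative only, not Faceswap's actual code; the LPIPS and FFL terms would slot in as extra weighted entries):

Code:

    import tensorflow as tf

    def dssim(y_true, y_pred):
        # Structural dissimilarity (1 - SSIM), averaged over the batch.
        return 1.0 - tf.reduce_mean(tf.image.ssim(y_true, y_pred, max_val=1.0))

    def mae(y_true, y_pred):
        # Mean absolute error, used as a secondary L1 regularisation term.
        return tf.reduce_mean(tf.abs(y_true - y_pred))

    def make_combined_loss(terms):
        """Build a single loss from a list of (loss_fn, weight) pairs."""
        def loss(y_true, y_pred):
            return tf.add_n([weight * fn(y_true, y_pred) for fn, weight in terms])
        return loss

    # Weights mirroring the setup above; LPIPS-Alex (0.50) and FFL (1.00) would
    # be appended as further entries once their implementations are plugged in.
    combined = make_combined_loss([(dssim, 1.0), (mae, 0.25)])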

I loved this configuration and have been using it for the past six months. However, since the update, I've been getting a lot of NaNs, OOM errors, and a loss of speed. I read in a recent post that the above loss function setup drastically slowed down training, and I confirmed a roughly 90% slowdown on my system (I think FFL at 100% is the culprit). I didn't feel like rolling back the CUDA Toolkit to restore the previous speeds as Torzdf suggested; I'm always moving ahead! I'm looking forward to testing out your suggestion.

Thanks for this info, and thanks Faceswap team!


Re: Yah new stuff, a vision transformer CLipV

Post by torzdf »

rbanfield82 wrote: Fri Aug 11, 2023 9:46 pm

I didn't feel like rolling back the CUDA Toolkit to restore the previous speeds as Torzdf suggested; I'm always moving ahead! I'm looking forward to testing out your suggestion.

Honestly, that's a false dichotomy. Your NaN issues will almost certainly be resolved by rolling back CUDA/cuDNN.

Basically, in your use case (and mine, as it happens), there is all downside to being on the later libraries and all upside to reverting.



Re: Yah new stuff, a vision transformer CLipV

Post by Ryzen1988 »

rbanfield82 wrote: Fri Aug 11, 2023 9:46 pm

LPIPS-Alex - 50%

I don't know how you manage this. I know there are papers describing how effective LPIPS is, but every time I use it, the result always goes to ugly mosaic patterns, strange eyes and strange faces.
Every time I try, I end up lowering it until it's off :shock:


Re: Yah new stuff, a vision transformer CLipV

Post by MaxHunter »

@Ryzen1988 I'm starting to lose my love for LPIPS as well. I'm noticing that the further I go in training, the more the moiré pattern appears. I recently lowered it to 20 from 25 on my recent model, but on my next model I'm thinking of cutting it out completely until the no-warp phase, and then adding it in to see if it will add more detail.
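
To make that schedule concrete (purely illustrative; in Faceswap the loss weights live in the training config and are changed between sessions, not by code like this):

Code:

    def lpips_weight(warp_enabled: bool, base_weight: float = 0.20) -> float:
        """LPIPS weight for the current phase: keep it at zero while random
        warping is on, then introduce a modest weight for the final no-warp
        fitting phase."""
        return 0.0 if warp_enabled else base_weight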


Re: Yah new stuff, a vision transformer CLipV

Post by Ryzen1988 »

MaxHunter wrote: Sun Aug 13, 2023 2:57 pm

@Ryzen1988 I'm starting to lose my love for LPIPS as well. I'm noticing that the further I go in training, the more the moiré pattern appears. I recently lowered it to 20 from 25 on my recent model, but on my next model I'm thinking of cutting it out completely until the no-warp phase, and then adding it in to see if it will add more detail.

Interesting, I hadn't thought about the option of trying it without warping, but I can't imagine it will influence the moiré patterns.
Still, it's an interesting idea to try once warp is disabled.


Re: LPIPS loss function

Post by torzdf »

FWIW, I still always use LPIPS. If I can spare the VRAM, I'll use LPIPS-VGG16; it performs better and appears to introduce less artefacting, so it can run at higher values.

I don't generally get the issue that others see; it generally trains out as the model progresses, but LPIPS values really can't be set high. I also find that artefacting appears to be more prominent the lower the output resolution.
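
For anyone who wants to compare the two backbones outside Faceswap, the reference lpips package on PyPI (PyTorch-based, pip install lpips) exposes the same Alex vs VGG choice; a quick sketch:

Code:

    import torch
    import lpips

    alex = lpips.LPIPS(net='alex')  # lighter backbone, less VRAM
    vgg = lpips.LPIPS(net='vgg')    # heavier backbone, tends to artefact less

    # Dummy image pair: NCHW tensors scaled to [-1, 1], as the package expects.
    img0 = torch.rand(1, 3, 256, 256) * 2 - 1
    img1 = torch.rand(1, 3, 256, 256) * 2 - 1

    print(alex(img0, img1).item(), vgg(img0, img1).item())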



Re: LPIPS loss function

Post by Ryzen1988 »

What other loss functions do you use with LPIPS?


Re: Yah new stuff, a vision transformer CLipV

Post by Barnuble »

MaxHunter wrote: Sun Aug 13, 2023 2:57 pm

@Ryzen1988 I'm starting to lose my love for LPIPS as well. I'm noticing that the further I go in training, the more the moiré pattern appears. I recently lowered it to 20 from 25 on my recent model, but on my next model I'm thinking of cutting it out completely until the no-warp phase, and then adding it in to see if it will add more detail.

Me too! (moiré pattern problems as well...)
Since my slowdown problems using LPIPS + FFL, I now use SSIM + LogCosh, and it seems to work fine.
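
For reference, log-cosh is a smooth pixel loss that behaves like L2 for small errors and like L1 for large ones, which is part of why it pairs well with SSIM. A minimal NumPy sketch (not Faceswap's implementation):

Code:

    import numpy as np

    def logcosh(y_true, y_pred):
        """Mean log(cosh(error)): roughly quadratic near zero error, roughly
        linear for large errors, so less sensitive to outlier pixels than MSE."""
        x = np.abs(y_pred - y_true)
        # Numerically stable form of log(cosh(x)) for large |x|.
        return float(np.mean(x + np.log1p(np.exp(-2.0 * x)) - np.log(2.0)))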


Re: LPIPS loss function

Post by Ryzen1988 »

I have noticed that FFL carries a real slowdown penalty; it also seems to bottleneck GPU utilization, dropping it from the mid-90% range to the mid-70% range.
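
That matches what the loss has to do under the hood: a full 2D FFT of both the prediction and the target every step, plus a per-frequency weighting. A minimal NumPy sketch of the focal frequency loss idea (illustrative only, not Faceswap's implementation):

Code:

    import numpy as np

    def focal_frequency_loss(y_true, y_pred, alpha=1.0):
        """y_true, y_pred: float arrays of shape (batch, height, width, channels)."""
        # 2D FFT over the spatial axes for every image and channel.
        f_true = np.fft.fft2(y_true, axes=(1, 2))
        f_pred = np.fft.fft2(y_pred, axes=(1, 2))
        diff = f_pred - f_true
        dist = np.abs(diff) ** 2
        # Focal weighting: emphasise the frequencies with the largest error.
        weight = np.abs(diff) ** alpha
        weight = weight / (weight.max() + 1e-8)  # normalise weights to [0, 1]
        return float(np.mean(weight * dist))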
