Search found 8 matches

by Icarus
Sat Jan 07, 2023 3:08 am
Forum: Training Discussion
Topic: [Guide] Introducing - Phaze-A
Replies: 89
Views: 191811

Re: [Guide] Introducing - Phaze-A

Sorry, I may have jumped to the wrong conclusion. You're right, this isn't a NaN issue; it's just the model crashing during creation (possibly due to VRAM). ...and @Icarus, if you're able to, I bet you could raise your epsilon, gain that extra detail, and avoid NaNs if you run multiple smaller cycles...
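
For reference, a minimal sketch of what raising the epsilon looks like on a Keras optimizer (Faceswap exposes this as an "Epsilon Exponent" setting; the value here is my own illustration, not a recommendation from the thread):

```python
import tensorflow as tf

# Illustrative only: an exponent of -4 gives epsilon = 1e-4, well above
# the Keras default of 1e-7 and safer under float16 arithmetic.
epsilon_exponent = -4
optimizer = tf.keras.optimizers.Adam(
    learning_rate=5e-5,
    epsilon=10 ** epsilon_exponent,
)
```
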
by Icarus
Fri Dec 16, 2022 2:19 am
Forum: Training Discussion
Topic: [Guide] Introducing - Phaze-A
Replies: 89
Views: 191811

Re: [Guide] Introducing - Phaze-A

For the life of me, I cannot get a SYM-384 model started. After the first model save (500 its) the preview always turns out like this: Screenshot 2022-11-28 at 10.27.49.png. I've tried lowering the learning rate as low as 1e-5 and it's still the same after the first model save. I've also tried s...
by Icarus
Fri Dec 16, 2022 2:14 am
Forum: Training Discussion
Topic: [Guide] Introducing - Phaze-A
Replies: 89
Views: 191811

Re: [Guide] Introducing - Phaze-A

That's like me driving a Toyota and he comes driving up next to me in a monster truck :geek: makes you question things in life :ugeek: This made me smile in a warm and fuzzy kinda way. :) These observations and your observations a few replies up are truly insightful and really made me question a few...
by Icarus
Sat Aug 27, 2022 12:23 am
Forum: Training Discussion
Topic: [Discussion] Notes on Loss functions
Replies: 13
Views: 31985

Re: Notes on Loss functions

LPIPS-Alex 5% - This loss function outputs large raw values, so its weight needs to be very low; how low will depend on what you are mixing it with. It sharpens up the swap more than any other function I've seen, but on its own it is a total disaster zone! FFL 100% - How much this helps/does n...
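
To make those percentages concrete, here is a hedged sketch of how such a mix is typically wired up; the function names and wiring are mine, not Faceswap's:

```python
def combined_loss(lpips_fn, pixel_fn):
    """Blend a strong perceptual loss at 5% with a pixel loss at 100%."""
    def _loss(y_true, y_pred):
        # LPIPS-style losses output large raw values, so they are
        # scaled well down relative to the stable pixel-loss backbone.
        return 0.05 * lpips_fn(y_true, y_pred) + 1.0 * pixel_fn(y_true, y_pred)
    return _loss
```
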
by Icarus
Sat Aug 20, 2022 10:35 pm
Forum: Training Discussion
Topic: [Discussion] How to fix Mixed Precision causing NaNs
Replies: 9
Views: 29683

Re: How to fix Mixed Precision causing NaNs

I have to admit, I have fallen badly out of love with Mixed Precision. Raising the epsilon exponent certainly does help, and it's good to know epsilon can be taken fairly high with no real detrimental effect. I think it has something to do with FP16's representable range. This is what Nvidia has to say...
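
You can see the FP16 range problem directly in plain NumPy (my own illustration): values near the default Adam epsilon of 1e-7 sit right at the bottom edge of what float16 can represent.

```python
import numpy as np

print(np.finfo(np.float16).tiny)  # 6.104e-05: smallest *normal* float16
print(np.float16(1e-7))           # ~1.192e-07: survives only as a subnormal
print(np.float16(1e-8))           # 0.0: underflows completely
```
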
by Icarus
Thu Aug 18, 2022 6:36 pm
Forum: Training Discussion
Topic: [Discussion] How to fix Mixed Precision causing NaNs
Replies: 9
Views: 29683

[Discussion] How to fix Mixed Precision causing NaNs

Mixed Precision: Last but not least, Mixed Precision. You love it and you hate it. It makes a huge difference in training speed and VRAM but is a frequent culprit of NaNs. I did some research on Nvidia's website regarding this and found the holy grail of hidden information that has cured ...
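
For context, a minimal sketch of the standard Keras mixed-precision setup, assuming a stack like Faceswap's; the learning rate and epsilon values are illustrative:

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision

# Compute in float16, keep variables in float32.
mixed_precision.set_global_policy("mixed_float16")

# A raised epsilon (vs the 1e-7 Keras default) keeps Adam's update
# denominator representable; dynamic loss scaling protects small gradients.
opt = tf.keras.optimizers.Adam(learning_rate=5e-5, epsilon=1e-4)
opt = mixed_precision.LossScaleOptimizer(opt)
```
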
by Icarus
Thu Aug 18, 2022 6:30 pm
Forum: Training Discussion
Topic: [Discussion] Notes on Loss functions
Replies: 13
Views: 31985

[Discussion] Notes on Loss functions

Loss functions: As it says in the Training Guide, the choice you make here will have an outsized impact on your entire model. I've tried them all, and a combination of MS_SSIM and MAE (L1) at 100% has produced the best results. The weird quirk with MS_SSIM is whenever I've tried to start a model using ...
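
A hedged sketch of the MS_SSIM + MAE combination using TensorFlow's built-in multiscale SSIM; the wiring is my own, not Faceswap's:

```python
import tensorflow as tf

def ms_ssim_plus_mae(y_true, y_pred):
    # ssim_multiscale returns a per-image similarity score; 1 - score
    # turns it into a loss. Inputs assumed NHWC in [0, 1] and large
    # enough (>= ~161px per side) for the default five scales.
    ms_ssim = tf.image.ssim_multiscale(y_true, y_pred, max_val=1.0)
    mae = tf.reduce_mean(tf.abs(y_true - y_pred), axis=[1, 2, 3])
    return (1.0 - ms_ssim) + mae  # both terms at 100%
```
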
by Icarus
Mon Aug 15, 2022 10:13 pm
Forum: Training Discussion
Topic: [Guide] Introducing - Phaze-A
Replies: 89
Views: 191811

Notes on Phaze A model architecture and settings

I've been experimenting with Phaze A for a year now using Nvidia A100 cloud GPUs, have tried a few common and one not-so-common setup, and wanted to share some of my notes on how different model architectures affect results. split fc layer, gblock enabled (not split), shared decoders: This is probab...
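
To illustrate what "split fc layer" and "shared decoders" mean structurally, here is a much-simplified sketch; this is not Phaze-A's actual code, and the G-Block path is omitted for brevity:

```python
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input((64, 64, 3))
feats = layers.Flatten()(layers.Conv2D(64, 5, strides=2, padding="same")(inp))

# "split fc layer": each identity gets its own Dense bottleneck.
fc_a = layers.Dense(512)(feats)
fc_b = layers.Dense(512)(feats)

# "shared decoders": a single decoder stack is reused for both identities.
decoder = tf.keras.Sequential([
    layers.Dense(32 * 32 * 32),
    layers.Reshape((32, 32, 32)),
    layers.Conv2DTranspose(3, 5, strides=2, padding="same"),
])
model_a = tf.keras.Model(inp, decoder(fc_a))
model_b = tf.keras.Model(inp, decoder(fc_b))
```
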