Highest quality model to use on a 4GB card?

Talk about Hardware used for Deep Learning


tomlick
Posts: 7
Joined: Sat Feb 08, 2020 2:27 am
Been thanked: 3 times

Highest quality model to use on a 4GB card?

Post by tomlick »

I have an older workstation laptop with an integrated 4GB Quadro. I have used faceswap in the past, before the model refactoring and the addition of Dlight. I would like to train at at least 128px if possible, but if absolutely necessary I'll manage with less. What does the Dlight model entail, and what are its VRAM requirements? There is not very much information available online that I can find. If Dlight isn't suitable for a 4GB card, I would happily use a different model, and would appreciate any suggestions, configurations, or pointers.

Last edited by tomlick on Mon Feb 10, 2020 8:08 am, edited 1 time in total.
torzdf
Posts: 2651
Joined: Fri Jul 12, 2019 12:53 am
Answers: 159
Has thanked: 129 times
Been thanked: 622 times

Re: Highest quality model to use on a 4GB card?

Post by torzdf »

Probably dfaker + Memory Saving Gradients.

You may be able to use a heavier model with the optimization settings enabled (check the training guide for details).
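For what it's worth, a minimal sketch of how that could look from the command line, assuming the standard faceswap CLI flags (the paths are placeholders, and Memory Saving Gradients itself is toggled in the training configuration rather than on the command line):

    # Hypothetical example: train dfaker on previously extracted face sets.
    # -A / -B point at the extracted faces for each side, -m is the model folder.
    python faceswap.py train -A /path/to/faces_a -B /path/to/faces_b -m /path/to/dfaker_model -t dfaker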

My word is final

tomlick
Posts: 7
Joined: Sat Feb 08, 2020 2:27 am
Been thanked: 3 times

Re: Highest quality model to use on a 4GB card?

Post by tomlick »

I've got dfaker running with the optimizer setting enabled. Thanks. It accepted the 256px images that I previously extracted, along with their alignments, even though it claims to be a 64px-in/128px-out model. Should I re-extract, or be concerned? The preview looks fine. Is there anything else I should enable or tweak?

Additionally, what exactly is the difference between Dlight and dfaker? I was able to get Dlight working fine (with the optimization settings enabled) and it seemed to have faster iterations and quicker progression of detail than dfaker, but I defer to those more apt to know the algorithms behind them. One would think that, since it's based on a dfaker variant, it would have to offer some improvements, or it would be a moot exercise.

Any advice or insight is appreciated!

blackomen
Posts: 9
Joined: Fri Feb 21, 2020 2:14 pm
Has thanked: 2 times
Been thanked: 1 time

Re: Highest quality model to use on a 4GB card?

Post by blackomen »

I have a GTX 960 MX with 4GB, and I was able to get it to train using Dlight with a batch size of 8, a size of 256, and the first two optimizer settings checked. I trained for about 8 hours before my A and B losses fell consistently below 0.02 each on a particular pair of videos.
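In case anyone wants to reproduce this, here is a rough sketch of the equivalent launch command, assuming the standard faceswap CLI flags (paths are placeholders; the output size and the optimizer/VRAM-saving checkboxes are set in the training configuration rather than on the command line):

    # Hypothetical example: train Dlight at batch size 8 on extracted face sets.
    python faceswap.py train -A /path/to/faces_a -B /path/to/faces_b -m /path/to/dlight_model -t dlight -bs 8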

Previously, I was using an older version of faceswap and trained the Lowmem model with a batch size of 24. Note that the exact same pair of videos took about 24 hours of training to achieve below 0.02 loss for A and B, roughly three times slower than Dlight.
