GTX 1650 Super 4GB - Optimizer savings enabled - train

Talk about Hardware used for Deep Learning


Post by centoos » Mon Mar 23, 2020 9:54 pm

Hi,
I'm new to Faceswap. When I try to train with the "Original" trainer, I get an out-of-memory error unless I enable "Optimizer Savings", even with the batch size set to 2. With it enabled, I can train with a batch size of up to 128.
This seems strange: on my old card (Quadro K4200, 4GB) I trained at batch size 16 with no memory-saving options. Much slower, of course, but it worked.
Thanks for any answers.
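A rough sketch of why moving the optimizer state off the GPU can free so much memory, assuming an Adam-style optimizer that keeps two extra state tensors per weight (the function name and 700 MB figure are illustrative, not Faceswap's actual accounting):

```python
def training_memory_mb(weights_mb, optimizer_on_gpu=True):
    # Adam-style optimizers keep two extra state tensors (momentum and
    # variance) per weight, so optimizer state is roughly 2x the weights.
    optimizer_mb = 2 * weights_mb if optimizer_on_gpu else 0
    return weights_mb + optimizer_mb

# A hypothetical 700 MB model:
print(training_memory_mb(700))         # 2100 MB with optimizer state on the GPU
print(training_memory_mb(700, False))  # 700 MB with optimizer state in system RAM
```

With the optimizer state moved to system RAM, most of the card's VRAM is left for activations, which scale with batch size; that would explain why batch 2 fails without the option but batch 128 fits with it.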

bryanlyon
Site Admin

Re: GTX 1650 Super 4GB - Optimizer savings enabled - train

Post by bryanlyon » Mon Mar 23, 2020 9:59 pm

This is probably due to drivers/Windows more than the card itself. Quadro cards use different drivers and probably reserve less memory for OS use, leaving more for Faceswap. Windows will often reserve 1/4 to 1/2 of your total VRAM. 4GB is really close to the minimum we support, and it's likely you're just running into that limit. You can try Linux, or turn the card into a non-display card so Windows doesn't reserve that memory (this requires another card or an on-chip GPU to drive the display). Beyond that, you can use the various memory-saving options, but they do slow down training.
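A quick back-of-the-envelope for the reservation described above (the function name and fractions are illustrative; the 1/4 to 1/2 range is the one mentioned in the post):

```python
def usable_vram_gb(total_gb, reserved_fraction):
    # Windows may reserve a chunk of VRAM for the display and OS;
    # reserved_fraction models the 1/4 to 1/2 range noted above.
    return total_gb * (1 - reserved_fraction)

print(usable_vram_gb(4, 0.25))  # 3.0 GB left for training
print(usable_vram_gb(4, 0.5))   # 2.0 GB left for training
```

On a 4GB card that reservation alone can eat 1-2GB, which is why a headless (non-display) card or Linux leaves noticeably more room for training.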


Re: GTX 1650 Super 4GB - Optimizer savings enabled - train

Post by centoos » Mon Mar 23, 2020 10:03 pm

Many thanks. For now I'm only testing; I'll probably buy an RTX card.
