
Re: Hardware best practices

Posted: Thu Feb 27, 2020 9:46 am
by torzdf

FWIW, I develop Faceswap on a Linux box with 8GB of RAM, so it's very unlikely I'll ever push an update that won't run on 8GB.


Re: Hardware best practices

Posted: Sun Jun 21, 2020 8:28 pm
by Boogie

Hi, does the video card only affect the speed of the training, or also the quality of the results?

For example, I have a slow card with only 2 GB of RAM (GTX 1050), but I am in no hurry and willing to leave my computer running for days or even weeks if that's what it takes to get good results. Should I still bother spending money on a better card?


Re: Hardware best practices

Posted: Mon Jun 22, 2020 9:54 am
by torzdf

It doesn't directly affect the quality of the model, but it does impact which models can be loaded.

In your example, the 2GB card could load the Lightweight model, and the output would be identical (albeit after a longer training time) to the Lightweight model trained on an 11GB card.

However, the 11GB card can train Villain/Dlight etc. (which are higher quality models), which the 2GB card would not be able to train.


Re: Hardware best practices

Posted: Tue Jul 21, 2020 1:58 am
by ferrafols

Hi. I use 2x 2080 Ti GPUs (2 cards) with NVLink for other tasks like rendering in 3ds Max and V-Ray, and I just started learning Faceswap. I have a question.
Does Faceswap automatically recognize the 2 GPUs with the NVLink, or is it just working with 1 card?... Maybe it's a silly question, but I don't know much about technical stuff.

Hope somebody comments.


Re: Hardware best practices

Posted: Tue Jul 21, 2020 4:13 am
by bryanlyon
ferrafols wrote: Tue Jul 21, 2020 1:58 am

Hi. I use 2x 2080 Ti GPUs (2 cards) with NVLink for other tasks like rendering in 3ds Max and V-Ray, and I just started learning Faceswap. I have a question.
Does Faceswap automatically recognize the 2 GPUs with the NVLink, or is it just working with 1 card?... Maybe it's a silly question, but I don't know much about technical stuff.

Hope somebody comments.

Faceswap does support multiple GPUs, but it's not automatic. You need to specify the number of GPUs to use with the -g option.
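
For illustration, a training launch over both cards from the command line might look roughly like this (the paths are placeholders, and apart from the -g flag the argument names are from memory and may differ by Faceswap version):

python faceswap.py train -A /path/to/faces-a -B /path/to/faces-b -m /path/to/model-dir -g 2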


Re: Hardware best practices

Posted: Tue Jul 21, 2020 6:15 am
by ferrafols
bryanlyon wrote: Tue Jul 21, 2020 4:13 am
ferrafols wrote: Tue Jul 21, 2020 1:58 am

Hi. I use 2x 2080 Ti GPUs (2 cards) with NVLink for other tasks like rendering in 3ds Max and V-Ray, and I just started learning Faceswap. I have a question.
Does Faceswap automatically recognize the 2 GPUs with the NVLink, or is it just working with 1 card?... Maybe it's a silly question, but I don't know much about technical stuff.

Hope somebody comments.

Faceswap does support multiple GPUs, but it's not automatic. You need to specify the number of GPUs to use with the -g option.

Yes, I saw the setting in the training options, but when I set it to 2 and it was starting to train, it crashed... the computer turned off completely. That's why I don't know if, with the NVLink, it is already using the 2 GPUs, or if the GPUs need to be without the NVLink to set the parameter to 2 GPUs.


Re: Hardware best practices

Posted: Thu Jul 23, 2020 5:27 pm
by bryanlyon
ferrafols wrote: Tue Jul 21, 2020 6:15 am

Yes, I saw the setting in the training options, but when I set it to 2 and it was starting to train, it crashed... the computer turned off completely. That's why I don't know if, with the NVLink, it is already using the 2 GPUs, or if the GPUs need to be without the NVLink to set the parameter to 2 GPUs.

If your system shut off when you started with both GPUs, then you're facing a power supply issue. Your power supply is failing when the sudden draw of 2 GPUs hits it. You can search this forum for the options and solutions available for that.


Re: Hardware best practices

Posted: Wed Aug 05, 2020 12:29 am
by abigflea
FWIW, I was looking at RTX prices today.
Cheapest 2060 Super 8GB - $399 US
Cheapest 2070 8GB - $410 US