
Re: Hardware best practices

Posted: Thu Feb 27, 2020 9:46 am
by torzdf

FWIW, I develop Faceswap on a Linux Box with 8GB of RAM, so it's very unlikely I'll ever push an update which won't run on 8GB


Re: Hardware best practices

Posted: Sun Jun 21, 2020 8:28 pm
by Boogie

Hi, does the video card only affect the speed of the training, or also the quality of the results?

For example, I got a slow card with only 2 gb ram (GTX 1050), but I am in no hurry and willing to leave my computer running for days or even weeks if that's what it takes to get good results. Should I still bother spending my money for a better card?


Re: Hardware best practices

Posted: Mon Jun 22, 2020 9:54 am
by torzdf

It doesn't directly affect the quality of the model, but it does impact the model that can be loaded.

In your example, the 2GB card could load the Lightweight model, and the output would be identical (albeit over a longer time) to the Lightweight model trained on an 11GB card.

However, the 11GB card can train Villain/Dlight etc (which are higher quality models), which the 2GB would not be able to train.
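To make the point above concrete, here is a minimal sketch (not Faceswap's actual logic) of "which trainer fits in how much VRAM", using the models named in this thread. The helper name and the GB thresholds for Dlight and Villain are rough assumptions for illustration, not official requirements:

```python
# Illustrative only: which of the trainers mentioned above could fit on
# a card, given its VRAM. Thresholds are assumed, not official figures.
def trainers_that_fit(vram_gb):
    requirements = [
        ("Lightweight", 2),   # runs on a 2 GB card per the post above
        ("Dlight", 8),        # assumed threshold
        ("Villain", 11),      # assumed threshold
    ]
    return [name for name, need in requirements if vram_gb >= need]

print(trainers_that_fit(2))   # a 2 GB card: only Lightweight
print(trainers_that_fit(11))  # an 11 GB card: all three
```

The key takeaway matches the reply: a bigger card doesn't make a given model's output better, it just lets you load bigger (higher-quality) models.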


Re: Hardware best practices

Posted: Tue Jul 21, 2020 1:58 am
by ferrafols

Hi. I use 2x 2080 Ti GPUs (2 cards) with NVLink for other tasks like rendering in 3ds Max and V-Ray, and I just started learning Faceswap, so I have a question:
Does Faceswap automatically recognize the 2 GPUs with the NVLink, or is it just working with 1 card? Maybe it's a silly question, but I don't know much about technical stuff.

Hope somebody comments.


Re: Hardware best practices

Posted: Tue Jul 21, 2020 4:13 am
by bryanlyon
ferrafols wrote: Tue Jul 21, 2020 1:58 am

Hi. I use 2x 2080 Ti GPUs (2 cards) with NVLink for other tasks like rendering in 3ds Max and V-Ray, and I just started learning Faceswap, so I have a question:
Does Faceswap automatically recognize the 2 GPUs with the NVLink, or is it just working with 1 card? Maybe it's a silly question, but I don't know much about technical stuff.

Hope somebody comments.

Faceswap does support multiple GPUs, but it's not automatic. You need to specify the number of GPUs to use with the -g option.
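For reference, the GPU count is passed on the command line when training; a sketch assuming the standard `train` command, where the folder paths are placeholders you'd replace with your own:

```shell
# Train on two GPUs by passing -g 2 (all paths below are placeholders).
python faceswap.py train \
    -A /path/to/faces_a \
    -B /path/to/faces_b \
    -m /path/to/model_dir \
    -g 2
```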


Re: Hardware best practices

Posted: Tue Jul 21, 2020 6:15 am
by ferrafols
bryanlyon wrote: Tue Jul 21, 2020 4:13 am
ferrafols wrote: Tue Jul 21, 2020 1:58 am

Hi. I use 2x 2080 Ti GPUs (2 cards) with NVLink for other tasks like rendering in 3ds Max and V-Ray, and I just started learning Faceswap, so I have a question:
Does Faceswap automatically recognize the 2 GPUs with the NVLink, or is it just working with 1 card? Maybe it's a silly question, but I don't know much about technical stuff.

Hope somebody comments.

Faceswap does support multiple GPUs, but it's not automatic. You need to specify the number of GPUs to use with the -g option.

Yes, I saw the bar in the training options, but when I set it to 2 and it was starting to train, it crashed... the computer turned off completely. That's why I don't know if, with the NVLink, it is already using the 2 GPUs, or if the GPUs need to be without the NVLink to set the parameter to 2 GPUs.


Re: Hardware best practices

Posted: Thu Jul 23, 2020 5:27 pm
by bryanlyon
ferrafols wrote: Tue Jul 21, 2020 6:15 am

Yes, I saw the bar in the training options, but when I set it to 2 and it was starting to train, it crashed... the computer turned off completely. That's why I don't know if, with the NVLink, it is already using the 2 GPUs, or if the GPUs need to be without the NVLink to set the parameter to 2 GPUs.

If your system shut off when you started with both GPUs, then you're facing a power supply issue. Your power supply is failing when the sudden draw of 2 GPUs hits it. You can search this forum for the options and solutions available for that.
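A back-of-the-envelope check helps here. The sketch below uses assumed ballpark TDP figures (roughly 250 W per 2080 Ti-class card), not measurements of this particular system, and the 30% headroom margin is a common rule of thumb since transient GPU spikes can exceed TDP:

```python
# Rough PSU sizing check for a multi-GPU box. All wattages are assumed
# ballpark TDP figures; the 1.3x factor leaves headroom for transient
# power spikes that can trip a supply running near its limit.
def recommended_psu_watts(gpu_tdp=250, n_gpus=2, cpu_tdp=150,
                          other=100, headroom=1.3):
    peak = gpu_tdp * n_gpus + cpu_tdp + other
    return int(peak * headroom)

print(recommended_psu_watts())  # -> 975 for two 2080 Ti-class cards
```

If your installed PSU is rated well below that kind of figure, a hard power-off the moment both cards spin up is exactly the symptom you'd expect.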


Re: Hardware best practices

Posted: Wed Aug 05, 2020 12:29 am
by abigflea

FWIW,
Was looking at RTX prices today.
Cheapest 2060 Super 8GB - $399 US
Cheapest 2070 8GB - $410 US


Re: Hardware best practices

Posted: Thu Oct 29, 2020 7:53 pm
by Boogie

Nvidia is far superior to AMD (You may debate this statement when it comes to games, but Nvidia positively trounces AMD in machine learning)

Will that statement still be valid after the release of the Radeon 6000 series next month?


Re: Hardware best practices

Posted: Thu Oct 29, 2020 8:48 pm
by abigflea

AMD doesn't focus on AI/compute. It's highly unlikely they will make enough changes to even begin to compete.
There's also the consideration that the software layer that allows AMD cards to do the compute will need to be updated.

I did see a rumor that they are making some compute-only cards instead of focusing on gaming, and that, if it's true, could be interesting, but I'd better see it in my hands or bryanlyon's before I make any assumptions.

Edit: the rumored line is called CDNA.


Re: Hardware best practices

Posted: Fri Oct 30, 2020 3:17 pm
by bryanlyon
Boogie wrote: Thu Oct 29, 2020 7:53 pm

Nvidia is far superior to AMD (You may debate this statement when it comes to games, but Nvidia positively trounces AMD in machine learning)

Will that statement still be valid after the release of the Radeon 6000 series next month?

Yes. The problem isn't the cards (which look fine) but the software around the cards. Nvidia has spent a TON of money and effort on CUDA and getting it well optimized for machine learning. That simply hasn't been a focus for AMD, and they are WAY behind. At best, cards with identical "gaming" performance would mean half the speed in machine learning.

Nvidia is the king of ML and probably will be for years to come since it'd be very hard to break their momentum.


Re: Hardware best practices

Posted: Fri Nov 06, 2020 3:57 pm
by hyperX321

Is it okay or possible to undervolt the GPU for DFL?
Is there any possible impact on the results? TIA


Re: Hardware best practices

Posted: Fri Nov 06, 2020 10:13 pm
by torzdf

We don't support DFL, but in principle, undervolting is fine.
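As a practical note: true undervolting is usually done with vendor tools (e.g. MSI Afterburner on Windows), but a related and simpler knob on NVIDIA cards is capping the power limit with `nvidia-smi`, which reduces heat and draw without touching voltage curves. The 150 W figure below is just an example value; check your card's allowed range first:

```shell
# Show the card's current, default and max power limits
nvidia-smi -q -d POWER

# Cap the card at 150 W (example value; requires root/admin)
sudo nvidia-smi -pl 150
```

Either approach trades a little speed for lower power/heat; it doesn't change the numerical results of training, only how fast you get them.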


Re: Hardware best practices

Posted: Fri Nov 13, 2020 8:26 am
by Lonesome007

I am using an Intel i3 8100 (4 cores, 4 threads); which video card is recommended to avoid a CPU bottleneck?
And could you please give a ranking of the deep learning power of common video cards? Just your experienced opinion is OK.
E.g., if an RTX 2060 is 100%, how much is a 1660S, 1660, 1060, RX 580, 1650S, 1650, RX 570, 1050 Ti, 1050?
Thanks, admins and everyone!


Re: Hardware best practices

Posted: Fri Nov 13, 2020 11:44 pm
by Boogie

Are you using VGA as a synonym for video card, or does it mean something else?


Re: Hardware best practices

Posted: Sat Nov 14, 2020 3:00 am
by Lonesome007
Boogie wrote: Fri Nov 13, 2020 11:44 pm

Are you using VGA as a synonym for video card, or does it mean something else?

Yes, my bad, edited, thank you 😍


Re: Hardware best practices

Posted: Sat Nov 14, 2020 3:06 am
by bryanlyon

It's impossible to rank one card against another to that degree; it depends on a lot. Generally, read the guide at the beginning of this thread, and if you have SPECIFIC questions, feel free to ask. But that's a very vague (and circumstantial) question.


Re: Hardware best practices

Posted: Thu Dec 03, 2020 5:25 pm
by akostadinov

Hi. I have a leftover Vega 10 16GB (Radeon Frontier Edition) from another project and thought I'd try out Faceswap with it. Which driver/settings would you recommend for it (ROCm or AMD Pro)?

Edit: Thanks to @torzdf's answer, which gave me a pointer on what to look for. Searching for PlaidML and ROCm sent me to a Phoronix benchmark of the RX 6800 on PlaidML vs ROCm, so I hope to be able to do that at least with ROCm 4.0. Good to know the proprietary drivers work, at least. I'll try to avoid them if possible, though.


Re: Hardware best practices

Posted: Fri Dec 04, 2020 10:34 am
by torzdf

Re: the first question: PlaidML is what is supported for AMD, so that will need AMDGPU Pro.

Some people have had some success compiling TF with ROCm support, but we don't directly support this (as it's Linux only); you can search the forum for it if that's the way you want to go.
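For anyone exploring the unsupported ROCm route mentioned above: rather than compiling TF yourself, there has been a prebuilt `tensorflow-rocm` package on PyPI (Linux only; assumes the ROCm stack itself is already installed and your GPU is on AMD's supported list). A sketch of the check you'd run:

```shell
# Unsupported-by-Faceswap route: ROCm build of TensorFlow (Linux only,
# assumes the ROCm driver stack is already installed on the system).
pip install tensorflow-rocm

# Verify TensorFlow can actually see the AMD GPU
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```

If the last command prints an empty list, TensorFlow isn't seeing the card and the ROCm install is the first place to look.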


Re: Hardware best practices

Posted: Fri Dec 04, 2020 5:15 pm
by jpebcac

OK, here is my question.

I currently have a GTX 1080 (non-Ti, 8GB).

I recently received a 2060 (ROG Strix GeForce RTX™ 2060 OC edition, 6GB GDDR6).

Now, the 2060 has 6GB and the 1080 has 8GB. When I run Villain, I can only use a batch size of 8 right now (it bombs any higher). I'm running a Ryzen 5950 + 64GB RAM, all NVMe disks. Am I better off sticking with the 1080 or the 2060 for Faceswap?