multigpu not working in GUI version


Post by frankywashere »

Hi, does anyone know how to activate multi-GPU in the GUI version of faceswap? I have (4) 3090 RTX and (1) 3060 RTX. It's only using one card. Do I have to change something in the settings?

Frank


Re: multigpu not working in GUI version

Post by torzdf »

It depends on what you are doing. Multi-GPU is only supported for training.

You have the option to exclude GPUs in the GUI (towards the bottom). When training, you need to select a distribution strategy (in the training section, alongside Batch Size and Iterations). You will want to select the "Mirrored" strategy.
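For anyone wondering what "Mirrored" actually does: FaceSwap trains on TensorFlow, and the Mirrored option roughly corresponds to TensorFlow's tf.distribute.MirroredStrategy, which copies the model onto every visible GPU and splits each batch between them. A minimal sketch in plain TensorFlow (illustrative only, not FaceSwap's actual trainer code; the model and sizes are placeholders):

Code: Select all

# Rough sketch of what a "Mirrored" distribution strategy does under the hood.
# Plain TensorFlow for illustration -- not FaceSwap's actual trainer code.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # one replica per visible GPU
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Variables created inside the scope are mirrored (copied) onto every GPU.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(128,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Each global batch is split across the replicas, so the per-GPU batch is
# global_batch // num_replicas_in_sync, and gradients are averaged every step.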



Re: multigpu not working in GUI version

Post by MaxHunter »

I have a 3080ti and a 3090.

First, when you say you have (4) 3090, do you mean it's in the 4th slot, or do you mean you have 4 separate 3090s? I've been told the more GPUs the slower it'll run.

Unless you are using SLI to combine the 3090s, it will not use both GPUs. However, all is not lost: it's my understanding that it will use both GPUs' memory. Depending on the size of your model, you can achieve higher batch numbers.

For myself, I use extremely large models and dedicate my 3090 to training (if you're on Windows, you can dedicate your cards to different programs). I then dedicate my 3080 Ti to gaming and surfing while training.

You won't realize how helpful that is until you don't have it.
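If you'd rather pin a process to one card from a script instead of the Windows graphics settings, the standard CUDA_VISIBLE_DEVICES variable does the same job. A minimal sketch (the index "0" is only my assumption about which slot the 3090 sits in; check nvidia-smi for your layout):

Code: Select all

# Pin this process to a single GPU before TensorFlow initialises.
# The index "0" is an assumption -- run nvidia-smi to see which index is which card.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # e.g. make only the 3090 visible

import tensorflow as tf
print(tf.config.list_physical_devices("GPU"))  # should now list exactly one GPU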


Re: multigpu not working in GUI version

Post by torzdf »

MaxHunter wrote: Sat Apr 15, 2023 3:10 pm

I've been told the more GPUs the slower it'll run.

I think you've misinterpreted that somewhere. It's diminishing returns. The speed increases come from being able to run larger batches. However, 2x GPUs does not mean that you will get 2x training speed. It may give you another 50% (assuming you double the batch size compared to a single GPU). As you add more GPUs it's not linear, so that percentage increase is not as much. It will still increase the speed (assuming batch size is scaled up at the same rate), but with each extra GPU the speed increase will be less.
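To put rough numbers on that shape, here is a toy sketch; the 0.5 efficiency factor is only taken from the "another 50%" figure above and is illustrative, not a benchmark:

Code: Select all

# Toy model of diminishing returns -- not a benchmark.
# efficiency=0.5 mirrors the "another ~50% from a second GPU" figure above.
def relative_speed(num_gpus: int, efficiency: float = 0.5) -> float:
    # The first GPU contributes 1.0x; each extra GPU contributes a smaller share.
    return sum(efficiency ** i for i in range(num_gpus))

for n in (1, 2, 4, 8):
    print(f"{n} GPU(s): ~{relative_speed(n):.2f}x (batch size scaled up to match)")

Each extra card still helps under this toy model, but by less and less.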

MaxHunter wrote:

Unless you are using SLI to combine the 3090s, it will not use both GPUs.

Again, sorry to do this to you, but this is also not true. SLI is not required for distributed training.




Re: multigpu not working in GUI version

Post by bryanlyon »

To be even more clear: adding more GPUs adds overhead, and there is a point where more GPUs will actually slow things down instead of speeding things up.

For example, when I last tested many GPUs it was on 1080s: 6x 1080s was faster than 8x 1080s, because they were limited by the PCI bus relative to their speed. Now, with faster PCIe connections and tensor cores, the equation may be different, but that was the last time I tested a large number of GPUs.


Re: multigpu not working in GUI version

Post by MaxHunter »

torzdf wrote:

MaxHunter wrote: Unless you are using SLI to combine the 3090s, it will not use both GPUs.

Again, sorry to do this to you, but this is also not true. SLI is not required for distributed training.

Yes, but if they're on a 3090/3060 setup, they can't run a model over 12 GB because of the 3060's memory limitations. So if they have a large model (i.e. a custom 512 or 1024), they are not going to be able to do "distributed" training, because the model can't fit in the 3060's memory, whereas with SLI they would be able to, because the model would sit on the 3090 while using the 3060's tensor cores in tandem (theoretically, because the 3060 can't be SLI'd). Right?

Also, @frankywashere ...

Listen to these guys, not me. 😆


Re: multigpu not working in GUI version

Post by bryanlyon »

No, SLI does not let them automatically share memory. Nothing in FaceSwap enables memory sharing between GPUs. SLI is completely a gaming technology. Even the NVLink that replaced it for a couple of generations requires active use of the technology to share memory. The only thing that NVLink will do is reduce the overhead of transferring data between the GPUs. It's still overhead and nothing lets you share memory.
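A quick way to see why the smallest card sets the ceiling: under mirrored (data-parallel) training, every GPU holds its own full copy of the model's variables, so nothing is pooled. A minimal sketch in plain TensorFlow (illustrative only; the variable size is a placeholder):

Code: Select all

# Illustration that mirrored training replicates variables rather than pooling memory.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    v = tf.Variable(tf.zeros([1024, 1024]))  # declared once in code...

# ...but one full copy lives on every replica, so total VRAM use is roughly
# (number of GPUs) x (model size), and the smallest card limits how big the model can be.
print(strategy.experimental_local_results(v))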
