
MULTI GPU - double speed at same batch size or not?

Posted: Sat Nov 28, 2020 10:13 am
by FuorissimoX

Hi everyone, I tried DeepFaceLab for some time, but for several reasons I would like to go back to FS.

I would like to understand whether the multi-GPU option works the same way as on DFL, or differently.

For example, on DFL I have these results:

1 x GPU, batch 10: 1200 ms per iteration

2 x GPU, batch 10: 1200 ms per iteration

2 x GPU, batch 20: 2400 ms per iteration

In other words, I have no improvement. I expected results like this:

1 x GPU, batch 10: 1200 ms per iteration

2 x GPU, batch 10: 600 ms per iteration

2 x GPU, batch 20: 1200 ms per iteration

What results could I expect on FACESWAP? Does it double the iteration speed, or will I get results similar to DFL?

Second question: does FS support SLI? In the FS settings, would it be better to use 2 GPUs separately, or the pair combined in SLI?

Many thanks


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sat Nov 28, 2020 7:56 pm
by dheinz70

I get similar results. See viewtopic.php?p=4146#p4146


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sat Nov 28, 2020 8:25 pm
by bryanlyon

We do not support DFL. Faceswap does see greatly increased speed on a properly set up multi-GPU system. It will not cut your time in half, as there are many things that cannot be parallelized, but it will reduce the total time per iteration.
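As an illustration of why two GPUs don't simply halve the time, here is a small Amdahl's-law sketch. The serial fractions are made-up illustrative numbers (covering things like disk I/O, augmentation, and gradient synchronization), not Faceswap measurements:

```python
# Amdahl's-law sketch: only the parallelizable part of an iteration
# scales with GPU count; the serial part runs at the same speed.
# serial_fraction values below are hypothetical, for illustration only.

def iteration_time(single_gpu_ms, n_gpus, serial_fraction):
    """Estimated per-iteration time when only the parallel part scales."""
    serial = single_gpu_ms * serial_fraction
    parallel = single_gpu_ms * (1.0 - serial_fraction) / n_gpus
    return serial + parallel

base = 1200.0  # ms per iteration on 1 GPU (figure from the first post)
for frac in (0.0, 0.2, 0.4):
    print(f"serial {frac:.0%}: 2 GPUs -> {iteration_time(base, 2, frac):.0f} ms")
```

With 0% serial work you get the ideal 600 ms; a 20% serial fraction already pushes the two-GPU iteration back up to 720 ms.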


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sat Nov 28, 2020 11:49 pm
by abigflea

Comparing multi-GPU versus a single GPU at the same batch size, it will sometimes train only somewhat faster, 120% or so.
What you can do is increase your batch size to almost 2x that of a single card. That's when you really see the benefits: your EGs/sec will go way up. Maybe not 200%, more like 160%-185%, depending on model, data, chipset, and the phase of the moon.

To a degree, having a larger batch lets the model train better/more efficiently. There are limits: batches of 80+ can make it train poorly or not at all. That's usually not an issue with Realface and Villain.
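Plugging in the timings quoted earlier in the thread shows where that gain appears: in examples per second (EGs/sec) rather than per-iteration time. The 1300 ms two-GPU figure below is an assumed, slightly-slower iteration, not a real measurement:

```python
# EGs/sec = batch size / iteration time in seconds.
# Timings are illustrative, reusing the numbers from the first post.

def egs_per_sec(batch, iter_ms):
    return batch / (iter_ms / 1000.0)

single = egs_per_sec(10, 1200)   # 1 GPU, batch 10
doubled = egs_per_sec(20, 1300)  # 2 GPUs, batch 20, assumed slightly slower
print(f"1 GPU:  {single:.2f} EGs/sec")
print(f"2 GPUs: {doubled:.2f} EGs/sec ({doubled / single:.0%} of single)")
```

Doubling the batch at a near-unchanged iteration time lands in the 160%-185% throughput range described above, even though the per-iteration time never halves.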

Be aware: make sure both cards connect to the PCIe bus at 8x or 16x. Anything less causes serious lag while the cards communicate during distributed training.

Things like those 1x PCIe mining riser cards MAY be kinda-sorta OK for a single card, but they will murder your speed in distributed training.


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sat Nov 28, 2020 11:53 pm
by abigflea

Sure, DFL is set up differently from FS. Differences like tensor support make comparing the two not very easy.


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sun Nov 29, 2020 7:03 am
by FuorissimoX

From what I read, using two GPUs in parallel does not change anything, just as on DFL.

Using the GPUs in SLI instead, there should in theory be an improvement, because the system treats them as if they were a single GPU with double the performance.

Can anyone confirm?

Has anyone tried GPUs in SLI?


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sun Nov 29, 2020 7:06 am
by bryanlyon

SLI is just for gaming and does NOT make the system treat it as a single card.

Faceswap specifically handles multiple GPUs, allowing each card to compute different images at the same time.

Many of our users (including myself) have used MultiGPU in Faceswap. It definitely works and increases the speed significantly.
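A minimal pure-Python sketch of that data-parallel idea (a toy one-parameter model, not Faceswap internals): each "device" gets a different slice of the batch, computes its own gradients, and the results are averaged before the shared weights are updated:

```python
# Toy data parallelism: shard one batch across N "devices", compute
# per-shard gradients for a 1-parameter linear model y = w * x, then
# average them. This mirrors the concept, not Faceswap's actual code.

def shard_gradient(w, xs, ys):
    """Mean gradient of squared error 0.5 * (w*x - y)**2 over one shard."""
    return sum((w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

def data_parallel_step(w, batch_x, batch_y, n_devices, lr=0.1):
    size = len(batch_x) // n_devices
    grads = []
    for d in range(n_devices):  # in reality these run concurrently, one per GPU
        sl = slice(d * size, (d + 1) * size)
        grads.append(shard_gradient(w, batch_x[sl], batch_y[sl]))
    avg_grad = sum(grads) / n_devices  # the gradient-averaging ("all-reduce") step
    return w - lr * avg_grad

# Fit w toward 3.0 on data y = 3*x, with the batch split across 2 "GPUs".
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]
w = 0.0
for _ in range(50):
    w = data_parallel_step(w, xs, ys, n_devices=2)
print(f"learned w = {w:.3f}")
```

Because only the gradient computation is split across devices, while the averaging and weight update stay shared, the per-step cost never drops to exactly 1/N of the single-device cost.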


Re: MULTI GPU - double speed at same batch size or not?

Posted: Wed Dec 16, 2020 11:30 pm
by 4dv3
bryanlyon wrote: Sun Nov 29, 2020 7:06 am

SLI is just for gaming and does NOT make the system treat it as a single card.

Faceswap specifically handles multiple GPUs allowing for each card to compute different images at the same time.

Many of our users (including myself) have used MultiGPU in Faceswap. It definitely works and increases the speed significantly.

How does multi-GPU work for the new RTX 3000 series and some RTX 2000 cards which have no SLI support?
I know that the only card in the 3000 series capable of SLI/NVLink is the RTX 3090.
So what if I have two RTX 3060 Tis, for example? Those cards currently have no way to be connected with an SLI bridge, since they are not supported.
Will Faceswap still recognize both GPUs and work with them even if they are not connected with SLI, or even supported by it?


Re: MULTI GPU - double speed at same batch size or not?

Posted: Thu Dec 17, 2020 12:28 am
by bryanlyon

Again, SLI is not the same as multiGPU. Using multiple GPUs for training doesn't require SLI and doesn't use traditional SLI at all.


Re: MULTI GPU - double speed at same batch size or not?

Posted: Fri Dec 18, 2020 6:26 pm
by 4dv3
bryanlyon wrote: Thu Dec 17, 2020 12:28 am

Again, SLI is not the same as multiGPU. Using multiple GPUs for training doesn't require SLI and doesn't use traditional SLI at all.

Do you know if both GPUs have to be the same model/card for multi-GPU, or does it not matter? Say I have a 3080 and a 3060 Ti, will both work on the same system?


Re: MULTI GPU - double speed at same batch size or not?

Posted: Sat Dec 19, 2020 2:24 am
by dheinz70

Two similarly capable cards should work. The rate-determining step will be the slowest card. Dunno if the 30xx cards are supported yet.