Hi guys! Thank you for the app!
I am choosing a configuration for my new PC and am wondering which is better for training: one RTX 2080 Ti with 11 GB, or three RTX 2070 Supers with 8 GB each. Both options cost nearly the same. What do you think?
I actually differ from Torzdf on this one. I believe the three 2070 Supers will give you faster results in training. HOWEVER, this will ONLY improve training. Extract, Convert, and all the tools only use one GPU for most things, so you won't see benefits there.
The only exception is if you wanted to use a larger model that doesn't fit in 8 GB of VRAM even at BS=1. However, in my mind the gains from that would be far outweighed by the cost of giving up the 3x GPU setup.
This decision does add complexity, however. A 3x GPU system means you'll have to design your build more carefully to deal with the limited number of PCI-E lanes (unless you're on Threadripper or similar), meaning your GPUs won't all get the 16 lanes they work best with. Power is also a serious consideration: three GPUs require a very hefty power supply, and I wouldn't be surprised if you need a 1000 W unit to keep those GPUs happy.
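As a rough sanity check (based on Nvidia's published board power, so treat the numbers as approximate): the 2070 Super is rated at 215 W, so three cards draw around 645 W on their own. Add 100-150 W for the CPU, plus headroom for the rest of the system and transient spikes, and a 1000 W supply is a sensible floor.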
If these aren't an issue, I'd definitely choose the three 2070 Supers over the 2080 Ti: you'll get faster training and be able to run larger total batch sizes.
Thank you bryanlyon! I'm building the new system from scratch, so I'm aware of the complexity and the special system requirements for three GPUs. The 3-GPU option is preferable for my other tasks, so I want to know whether training with 3x 2070S is possible and significantly faster than with one 2080 Ti. BTW, DeepFaceLab doesn't have multi-GPU support for now.
What kind of model can't fit in 8 GB at BS=1?
NVLink shared memory only works with an RTX Titan or Quadro card, so it won't affect your situation. In addition, only two cards will link unless you buy a system directly from Nvidia (such as the DGX).
Several of our models are configurable, with flexible options for how they're built. In those cases, a large enough configuration could strain any card, let alone one with just 8 GB. I don't recommend using those settings, though.
Yes, DFL does not support multi-GPU. We, however, do support it. It works just fine: set the batch size to a multiple of the number of GPUs and use the -g flag with the number of GPUs (or choose the correct number in the GUI), and we'll use all of your GPUs.
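For example, a minimal sketch of a three-GPU training run (the paths are placeholders, and flag names can vary between versions, so check python faceswap.py train -h for your install):

python faceswap.py train -A /path/to/faces_a -B /path/to/faces_b -m /path/to/model -bs 48 -g 3

Here -g 3 spreads the work across all three cards, and the batch size of 48 divides evenly by 3, so each GPU processes 16 images per step.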
Mainly PCI-E lanes. Most setups could actually go slower with four cards instead of three, since it'd drop the number of PCI-E lanes available to each card.
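As a rough illustration (lane counts and slot splits vary by platform, so these are ballpark figures): a mainstream desktop CPU only provides around 16-24 PCI-E lanes, so three cards already end up at something like x8/x4/x4, and adding a fourth just splits the pool further. A HEDT chip like Threadripper, with 60+ usable lanes, is what you'd want to keep three or four cards running at full width.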