Hardware best practices

Talk about hardware used for deep learning


torzdf
Posts: 2621
Joined: Fri Jul 12, 2019 12:53 am
Answers: 155
Has thanked: 128 times
Been thanked: 610 times

Re: Hardware best practices

Post by torzdf »

bryanlyon wrote: Mon Nov 13, 2023 11:26 pm

The trouble with hardware recommendations since the 20xx series has been that you always have to add "if you can get it for a good price" to the end of ANY recommendation. Will a 4060 Ti with 16GB of VRAM work? Definitely. Will it beat a $100 1080? Yes. Is it worthwhile to buy it for FaceSwap? That's ENTIRELY up to you.

16GB of VRAM is enough for any of our models, most of which were designed to work with 8GB 2060 Tis or 1070s. 16GB of VRAM will be plenty for most FaceSwap tasks.

Is the 4060 Ti fast enough? Unquestionably. I started training with 2x 970s in SLI; they'd take days to do what a 4060 Ti could do in hours.

Is it the right price? That's up to you. For some people, spending ANY money on FaceSwap is frivolous, as it's just "for the memes". Others spend thousands on A6000s and consider them a bargain.

So I repeat: Will a 4060 Ti with 16GB of VRAM work? Definitely. Will it beat a $100 1080? Yes. Is it worthwhile to buy it for FaceSwap? That's ENTIRELY up to you.

I would also add to the above that if you want to do this professionally (as in full HD and above), then 16GB is unlikely to be enough. You'd really be looking at at least double that, but it is hard to make a recommendation here, as "professionally" is a broad term, and the prices start to become somewhat insane.

Last edited by torzdf on Mon Nov 13, 2023 11:29 pm, edited 1 time in total.

My word is final

trippod
Posts: 3
Joined: Mon Nov 13, 2023 8:23 am
Has thanked: 4 times

Re: Hardware best practices

Post by trippod »

Could 2x 3060 12GB in SLI be more interesting?


Re: Hardware best practices

Post by torzdf »

I have not linked 2 GPUs for training with NVLink (you'd need NVLink, not SLI).

However, without NVLink, 2 GPUs would just let you run bigger batch sizes; they would not allow you to load larger models, as a copy of the model needs to be loaded onto each GPU.

You'd need to research NVLinking 2 GPUs to see whether they can be presented as one large GPU (that is, the VRAM combined). I suspect they will not, and you will hit the same barrier as with 2 GPUs without NVLink.
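To sketch why each extra card only helps the batch size, here is a toy memory model (illustrative made-up numbers, not real Faceswap or GPU figures): in data-parallel training each GPU must hold a full copy of the model weights, so adding cards scales the combined batch but never the size of model that fits.

```python
# Toy memory model of data-parallel training (illustrative numbers only).
# Each GPU holds a full replica of the model weights, so extra cards
# scale the batch dimension, never the model that fits on one card.
def max_global_batch(vram_gb, model_gb, per_sample_gb, n_gpus):
    """Largest combined batch across all GPUs, given replicated weights."""
    free_per_gpu = vram_gb - model_gb      # model is replicated, not split
    if free_per_gpu <= 0:
        return 0                           # model doesn't fit on one card at all
    per_gpu_batch = int(free_per_gpu / per_sample_gb)
    return per_gpu_batch * n_gpus          # only the batch scales with GPUs

# A 6 GB model on 12 GB cards: the batch doubles with a second card...
print(max_global_batch(12, 6, 0.5, 1))   # 12
print(max_global_batch(12, 6, 0.5, 2))   # 24
# ...but a 14 GB model fits on neither one card nor two.
print(max_global_batch(12, 14, 0.5, 2))  # 0
```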

My word is final

bryanlyon
Site Admin
Posts: 792
Joined: Fri Jul 12, 2019 12:49 am
Answers: 44
Location: San Francisco
Has thanked: 4 times
Been thanked: 215 times
Contact:

Re: Hardware best practices

Post by bryanlyon »

The only 30xx card with the NVLink fingers is the 3090. All others cannot do any sort of "linking". That said, you can still use multiple GPUs in Faceswap. It is diminishing returns, where each additional GPU slows the collective down more, and, as torzdf says, only the batch size can be increased with additional cards, not the size of the model itself.
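The diminishing-returns point can be illustrated with a toy scaling model (the overhead numbers are made up for illustration, not measured Faceswap figures): each extra card adds synchronization cost to every training step, so throughput grows sub-linearly with the number of GPUs.

```python
# Toy multi-GPU scaling model (made-up overhead numbers, purely illustrative).
# Each extra card adds a fixed synchronization cost to every training step,
# so throughput grows sub-linearly with the number of GPUs.
def relative_throughput(n_gpus, step_time=1.0, sync_cost=0.15):
    """Throughput relative to a single GPU, with per-card sync overhead."""
    effective_step = step_time + sync_cost * (n_gpus - 1)
    return n_gpus * step_time / effective_step

for n in (1, 2, 3, 4):
    print(n, round(relative_throughput(n), 2))  # 1 -> 1.0, 4 -> ~2.76
```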
