GPU RAM Size vs. TFLOPS: which matters more?

Talk about Hardware used for Deep Learning


User avatar
haoylee
Posts: 9
Joined: Thu Nov 18, 2021 3:16 pm
Has thanked: 1 time
Been thanked: 1 time

GPU RAM Size vs. TFLOPS: which matters more?

Post by haoylee »

I'm convinced that an Nvidia GPU is the best and only solution for getting quick results, but the graphics card market is still insane right now, so I want to be cautious.

Is a GPU's RAM size more important, or its TFLOPS figure? Or some other parameter? (I don't know which measure of compute speed is the right one for judging faceswap performance, only that more RAM is better.)

I'm currently using a 2080 Ti, with 11 GB of RAM and 14.23 TFLOPS.

Cards with similar specs:

3060: 12 GB of RAM and 12.74 TFLOPS, at a fairly low price.
3080 Ti: 12 GB of RAM and 34.1 TFLOPS, at an acceptably high price.
3090: 24 GB of RAM and 35.58 TFLOPS, at an unreasonable price on the Chinese market right now.

Again, I know nothing about the relationship between the TFLOPS number and faceswap performance, and I'm not sure there is one. I just want to know whether there are any other decisive GPU specs, apart from RAM size, that would improve training speed.
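For what it's worth, the advertised FP32 TFLOPS figures above are just CUDA cores x boost clock x 2 FLOPs per cycle. A minimal sketch that reproduces them (core counts and boost clocks are assumed from Nvidia's published reference specs):

# Rough sketch: advertised FP32 TFLOPS = CUDA cores * boost clock (GHz) * 2 FLOPs/cycle.
# Core counts and boost clocks are assumed from Nvidia's published reference specs.
cards = {
    "RTX 2080 Ti": (4352, 1.635),
    "RTX 3060":    (3584, 1.777),
    "RTX 3080 Ti": (10240, 1.665),
    "RTX 3090":    (10496, 1.695),
}

for name, (cores, boost_ghz) in cards.items():
    tflops = cores * boost_ghz * 2 / 1000  # GFLOPS -> TFLOPS
    print(f"{name}: {tflops:.2f} TFLOPS (FP32)")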

User avatar
torzdf
Posts: 2651
Joined: Fri Jul 12, 2019 12:53 am
Answers: 159
Has thanked: 129 times
Been thanked: 622 times

Re: GPU RAM Size vs. TFLOPS: which matters more?

Post by torzdf »

@bryanlyon will be better able to answer, but I'm not sure TFLOPS is the best measure of ML performance.

Either way, speed vs. RAM size comes down purely to preference. A larger RAM size will let you train bigger models; a faster GPU will let you train models more quickly. Ultimately it comes down to whether you want to train larger (more complex) models more slowly, or smaller models more quickly.
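If you want to see how close a given model and batch size get to your card's limit, you can just watch VRAM usage while training. A minimal sketch using nvidia-smi (not part of faceswap itself; assumes the Nvidia driver is installed so nvidia-smi is on your PATH):

# Minimal sketch: report total/used VRAM per GPU via nvidia-smi.
# Run it in a second terminal while faceswap is training.
import subprocess

query = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total,memory.used",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in query.stdout.strip().splitlines():
    name, total, used = [field.strip() for field in line.split(",")]
    print(f"{name}: {used} used of {total}")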

My word is final

User avatar
haoylee
Posts: 9
Joined: Thu Nov 18, 2021 3:16 pm
Has thanked: 1 time
Been thanked: 1 time

Re: GPU RAM Size vs. TFLOPS: which matters more?

Post by haoylee »

So maybe, economically speaking, the RTX 3060 is a better choice than the 3070 or 3080, right?

User avatar
bryanlyon
Site Admin
Posts: 793
Joined: Fri Jul 12, 2019 12:49 am
Answers: 44
Location: San Francisco
Has thanked: 4 times
Been thanked: 218 times

Re: GPU RAM Size vs. TFLOPS: which matters more?

Post by bryanlyon »

The 3060 is my recommended card. See viewtopic.php?f=16&t=10

That said, right now it's more about the card you can get than the card you want.

User avatar
haoylee
Posts: 9
Joined: Thu Nov 18, 2021 3:16 pm
Has thanked: 1 time
Been thanked: 1 time

Re: GPU RAM Size vs. TFLOPS: which matters more?

Post by haoylee »

bryanlyon wrote: Tue Nov 30, 2021 3:51 pm

The 3060 is my recommended card. See viewtopic.php?f=16&t=10

That said, right now it's more about the card you can get than the card you want.

Hi Bryan,

I get the general point that the 3060 is a relatively good choice right now, but since I already have a 2080 Ti, I would like to know whether the 1 GB difference in VRAM really matters. After all, apart from faceswap, I also use my computer for some 3D modeling and real-time rendering tasks, so maybe I should just keep my 2080 Ti?

User avatar
bryanlyon
Site Admin
Posts: 793
Joined: Fri Jul 12, 2019 12:49 am
Answers: 44
Location: San Francisco
Has thanked: 4 times
Been thanked: 218 times

Re: GPU RAM Size vs. TFLOPS: which matters more?

Post by bryanlyon »

Sure, the 2080 Ti is a fine card. You won't be able to go as big, but in many tasks your card will run faster.

It's a balancing act with the current cards, especially when you consider the lack of availability in retail channels. In general, unless you have specific needs, I wouldn't recommend upgrading at this time.
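If you want a rough feel for how two cards compare on raw training throughput rather than advertised TFLOPS, timing a large matrix multiply is a reasonable proxy. A minimal sketch with TensorFlow (the matrix size n is an arbitrary assumption; shrink it if it doesn't fit in VRAM, and note this is a compute proxy, not a faceswap benchmark):

# Rough throughput check: time repeated large FP32 matmuls on the GPU.
import time
import tensorflow as tf

n = 8192                      # assumed size; reduce if it doesn't fit in VRAM
a = tf.random.normal((n, n))
b = tf.random.normal((n, n))

_ = tf.matmul(a, b).numpy()   # warm-up so setup cost isn't timed

iters = 20
start = time.perf_counter()
for _ in range(iters):
    c = tf.matmul(a, b)
_ = c.numpy()                 # force the GPU work to finish before stopping the clock
elapsed = time.perf_counter() - start

# one n x n matmul is roughly 2 * n^3 floating-point operations
print(f"~{2 * n**3 * iters / elapsed / 1e12:.1f} effective FP32 TFLOPS")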
