I'm convinced that an NVIDIA GPU is the best (and realistically the only) way to get quick results right now, but the graphics card market is still insane, so I want to be cautious before buying.
Is the GPU's VRAM size more important, or its TFLOPS figure? Or some other parameter? (I don't know which compute metric actually predicts faceswap performance; I only know that more VRAM is better.)
I'm currently using a 2080 Ti (11 GB VRAM, 14.23 TFLOPS). The candidates I'm comparing:

- 3060: 12 GB VRAM, 12.74 TFLOPS, fairly low price
- 3080 Ti: 12 GB VRAM, 34.1 TFLOPS, acceptably high price
- 3090: 24 GB VRAM, 35.58 TFLOPS, unreasonably priced in the Chinese market right now
Again, I know nothing about the relationship between the TFLOPS number and faceswap training performance, and I'm not sure there even is one. I just want to know whether any other GPU specs, apart from VRAM size, are decisive for training speed.
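One way to think about the VRAM-vs-TFLOPS trade-off: VRAM caps the model size and batch size you can fit at all, while TFLOPS mostly affects how fast each training step runs once it fits. Below is a very rough, assumption-heavy sketch (not a faceswap-specific formula; the parameter count, FP32 training, Adam optimizer, and per-sample activation cost are all made-up illustrative numbers) of how training VRAM usage scales:

```python
# Back-of-envelope training VRAM estimate. Assumptions (hypothetical, not
# measured from faceswap): FP32 weights, Adam optimizer (which keeps two
# extra per-parameter moment buffers), and a fixed activation-memory cost
# per sample in the batch.

def estimate_training_vram_gb(num_params_millions: float,
                              batch_size: int,
                              activation_mb_per_sample: float = 150.0) -> float:
    """Very rough training VRAM estimate in GB."""
    bytes_per_param = 4          # FP32
    copies = 4                   # weights + gradients + 2 Adam moment buffers
    model_bytes = num_params_millions * 1e6 * bytes_per_param * copies
    activation_bytes = batch_size * activation_mb_per_sample * 1024 ** 2
    return (model_bytes + activation_bytes) / 1024 ** 3

# Hypothetical 60M-parameter model at batch size 16
print(round(estimate_training_vram_gb(60, 16), 2))  # -> 3.24
```

The point of the sketch: doubling the batch size roughly doubles the activation term, so a 24 GB card (3090) lets you run much larger batches or models than a 12 GB card, while a faster-but-same-VRAM card (3080 Ti vs 3060) only shortens each step.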