Ryzen Threadripper vs Graphics Card training

Talk about Hardware used for Deep Learning


Locked
ugramund
Posts: 27
Joined: Sat Feb 20, 2021 3:56 am
Location: India
Has thanked: 25 times
Been thanked: 4 times

Ryzen Threadripper vs Graphics Card training

Post by ugramund »

I know this answer cannot be given without real-world testing, but I would like to know what the Faceswap developers think
about the monstrous Ryzen Threadripper series, and whether we can compare them to any graphics card when it comes to
training a model. For comparison, we can take the Original & Villain models.

AMD Ryzen Threadripper 3990X Processor - 64 cores 128 threads
AMD Ryzen Threadripper 3970X Processor - 32 cores 64 threads
AMD Ryzen Threadripper 3960X Processor - 24 cores 48 threads

Are any of the above CPUs comparable to a GTX 1650 Super, GTX 1660 Super or an RTX 2060?
If not, can they be compared to a GTX 1070, GTX 1060 or at least a GTX 1050 Ti?

Apart from answers from the Faceswap developers, I would love to hear from someone who actually owns one of these
CPUs and can share real-world CPU training results.

And why do CPUs lag behind graphics cards when it comes to training? Is it because not enough CPU-based
machine learning software has been developed?

In love with the RealFace model.

torzdf
Posts: 2651
Joined: Fri Jul 12, 2019 12:53 am
Answers: 159
Has thanked: 129 times
Been thanked: 622 times

Re: Ryzen Threadripper vs Graphics Card training

Post by torzdf »

Yes, we can compare.

Any GPU that can do ML will wipe the floor with these processors for machine learning.

CPUs are not made for matrix calculations. GPUs are.
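To put a rough number on that point: training is dominated by large matrix multiplications (convolutions are lowered to them), and a matmul's cost grows as 2·n³ floating-point operations. Here is a minimal sketch that measures what a CPU actually achieves on one such multiply, assuming numpy is installed; the matrix size is illustrative and not taken from any Faceswap model. A mid-range GPU's published FP32 peak is in the multi-TFLOP/s range, which is why even old cards pull ahead.

```python
import time
import numpy as np

# One n x n matrix multiply costs roughly 2 * n^3 floating-point
# operations (one multiply and one add per inner-loop step).
n = 2048
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

start = time.perf_counter()
c = a @ b                      # BLAS matmul on the CPU
elapsed = time.perf_counter() - start

flops = 2 * n ** 3             # ~17.2 GFLOPs for n = 2048
print(f"CPU achieved {flops / elapsed / 1e9:.1f} GFLOP/s")
```

Divide a card's spec-sheet TFLOP/s by the number this prints on your Threadripper and you get a crude upper bound on the speedup; real training won't hit peak on either device, but the gap stays large.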

FWIW, before I bought an Nvidia GPU, I tried training on a 40-core Xeon. It was insanely slow. My ancient AMD R9 290 absolutely wiped the floor with it,
and my old Nvidia GTX 1080 wiped the floor with that.

Ultimately, look at it like this: would you try to run Cyberpunk 2077 on your Threadripper? Of course not. Even an old GPU would run rings around it.

My word is final
