Tesla m40 for deep learning




Tesla m40 for deep learning

Post by lmcke169 »

Good afternoon,

I recently purchased a Tesla M40 to use for deepfakes, as my 1070 was limiting my model resolution, and I am hoping the double-precision speed can help prevent model collapse.

Has anyone had experience getting one of these going in a consumer desktop? They aren't exactly plug and play, so please let me know if you have any tips. I won't be able to try it for a day or two, as I need to model and print a fan-shroud attachment to cool the card, but I will be checking this thread regularly. Thanks!


Re: Tesla m40 for deep learning

Post by bryanlyon »

An M40 is basically the same as the 9xx-series Titan X: they share the same chip (GM200), and the only differences are power limits and slightly different core clocks. Honestly, you'll be better off using the 1070. The M40 will be slower, and likely less stable, since you'll almost certainly run into driver issues. You may be thinking of half precision, which the M40 could maybe do better, but we don't currently support half-precision training.
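
If you want to see exactly what TensorFlow reports for each card, here's a quick sketch (not part of faceswap, and it assumes TensorFlow 2.x) that prints every visible GPU's name and compute capability:

Code: Select all

# Quick sketch: list each GPU TensorFlow can see, with its compute capability.
# Assumes TensorFlow 2.x; get_device_details needs TF 2.3 or newer.
import tensorflow as tf

for gpu in tf.config.list_physical_devices("GPU"):
    details = tf.config.experimental.get_device_details(gpu)
    print(gpu.name,
          details.get("device_name"),
          "compute capability:", details.get("compute_capability"))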

Look at the top two lines of the following chart, which compares the same chip as your M40 against your 1070 in TensorFlow. You can clearly see that your 1070 is a much better choice for faceswap.

[Image: TensorFlow benchmark chart comparing the GM200 (Titan X) against the GTX 1070]


Re: Tesla m40 for deep learning

Post by more11o »

The 8-pin power connector isn't wired the same as the PCIe 8-pin on consumer GPUs. I believe it's the same as the CPU (EPS) 8-pin, but please double-check.

I'm not sure what size shroud you are printing, but you will probably need the next size up and a high-pressure fan. Cooling 250 W with a makeshift blower will not be easy.

I would second bryanlyon in saying the 1070 is a better choice: roughly 100 W less power for much more performance.


Re: Tesla m40 for deep learning

Post by lmcke169 »

Thank you for the responses. The forum didn't email me about them for some reason.

I finally have the M40 hooked up, and right now my cooling situation sucks. I 3D-printed a shroud that connects to one of my terrible case fans. It thermal throttles constantly, but I purchased a high-speed fan and shroud from eBay that should help, and I might take the card apart and apply liquid metal to it.
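
For anyone else trying this, something like the sketch below should work for keeping an eye on temperature and clocks while training. It assumes the nvidia-ml-py (pynvml) package is installed, and it reads the same NVML interface that nvidia-smi and Afterburner use:

Code: Select all

# Rough sketch: poll GPU temperature and SM clock via NVML to spot
# thermal throttling. Assumes the nvidia-ml-py package (import name
# pynvml) is installed.
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]
try:
    while True:
        for h in handles:
            name = pynvml.nvmlDeviceGetName(h)
            if isinstance(name, bytes):  # older pynvml returns bytes
                name = name.decode()
            temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
            clock = pynvml.nvmlDeviceGetClockInfo(h, pynvml.NVML_CLOCK_SM)
            print(f"{name}: {temp} C, SM clock {clock} MHz")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()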

I am not using this M40 for speed; I am using it for its 24 GB of VRAM, which lets me train at a higher resolution and batch size. With it throttling right now it's a fraction of the speed of my 1070, but I can push a 320-res model at a batch size of 8, versus the 1070 only handling 196 res at 6 per batch.
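
As a very rough back-of-envelope (it ignores the model weights and optimizer state, so take it loosely), activation memory scales roughly with resolution squared times batch size, which lines up reasonably with the jump from 8 GB to 24 GB:

Code: Select all

# Very rough back-of-envelope: relative activation memory ~ res^2 * batch.
# Ignores weights, optimizer state and architecture differences.
m40 = 320 ** 2 * 8   # 320 res, batch 8 on the 24 GB M40
gtx = 196 ** 2 * 6   # 196 res, batch 6 on the 8 GB 1070
print(f"~{m40 / gtx:.1f}x the activations")  # ~3.6x, versus a 3x VRAM jump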

Getting it running with DeepFaceLab was surprisingly easy. I can't use the full driver package for it because of the 1070, but I was able to install the driver manually by unpacking the installer and adding it through Device Manager. It isn't seen by most tools (FurMark, NZXT CAM, Cinebench), but DeepFaceLab and MSI Afterburner picked it up right away. I will try faceswap after I give it a few days on DeepFaceLab.
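
One tip for anyone else with a mixed setup: if a tool grabs the wrong card, you can hide the 1070 from CUDA before launching. The device index below is just an assumption; check nvidia-smi to see which index your M40 actually gets:

Code: Select all

# Sketch: make only the M40 visible to CUDA/TensorFlow in a mixed
# 1070 + M40 box. The index "1" is an assumption -- check nvidia-smi.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "1"  # must be set before CUDA initialises

import tensorflow as tf
print(tf.config.list_physical_devices("GPU"))  # should now list only the M40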


Re: Tesla m40 for deep learning

Post by Aubrey Drake Graham »

Can we discuss this further on Discord? I am looking into buying one. Aubrey Drake Graham#1991
