Tesla M40 for deep learning
Posted: Fri Apr 24, 2020 7:32 pm
by lmcke169
Good afternoon,
I recently purchased a Tesla M40 to use for deepfakes, as my 1070 was limiting my model resolution, and I'm hoping the double-precision speed can help prevent model collapse.
Has anyone had experience getting one of these going in a consumer desktop? They aren't exactly plug and play, so any tips would be appreciated. I won't be able to try anything for a day or two, as I need to model and print a fan-shroud attachment to cool the card, but I'll be checking this thread regularly. Thanks!
Re: Tesla M40 for deep learning
Posted: Fri Apr 24, 2020 7:43 pm
by bryanlyon
An M40 is basically the same card as the 9xx-series Titan X: they share the same chip (GM200), and the only differences are the power limits and slightly different core clocks. Honestly, you'll be better off using the 1070. The M40 will be slower, and likely less stable, since you'll almost certainly run into driver issues. You may be thinking of half precision, which the M40 could maybe do better, but we don't currently support half-precision training.
Look at the top two lines of the following chart, which compares the same chip as your M40 against your 1070 in TensorFlow. You can clearly see that the 1070 is a much better choice for faceswap.
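On the half-precision point, here's a rough sketch (my own simplification, not faceswap code) of which NVIDIA generations actually have a fast FP16 path, keyed by CUDA compute capability:

```python
# Rough rule of thumb for fast FP16 math by CUDA compute capability
# (ignoring the Tegra X1 / 5.3 exception). Maxwell (5.2, e.g. Tesla M40)
# has no fast FP16 path; consumer Pascal (6.1, e.g. GTX 1070) runs FP16
# at a tiny fraction of FP32 rate.

def fast_fp16(major, minor):
    """True if the architecture runs FP16 at >= FP32 throughput."""
    if major < 6:
        return False   # Kepler/Maxwell: FP16 storage only, no fast math
    if (major, minor) == (6, 0):
        return True    # GP100 (Tesla P100): 2x-rate FP16
    if major == 6:
        return False   # GP102/104/106 consumer Pascal: ~1/64-rate FP16
    return True        # Volta (7.0) and later: tensor cores

print(fast_fp16(5, 2))  # Tesla M40  -> False
print(fast_fp16(6, 1))  # GTX 1070   -> False
print(fast_fp16(7, 0))  # Tesla V100 -> True
```

So neither the M40 (5.2) nor the 1070 (6.1) would gain much from half precision even if it were supported.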

Re: Tesla M40 for deep learning
Posted: Sat Apr 25, 2020 10:58 pm
by more11o
The 8-pin power connector isn't wired the same as the PCIe 8-pin on consumer GPUs. I believe it's the same pinout as the CPU (EPS) 8-pin, but please double-check.
I'm not sure what size shroud you're printing, but you'll probably need the next size up and a high-pressure fan. Cooling 250 W with a makeshift blower will not be easy.
I'd second bryanlyon in saying the 1070 is the better choice: about 100 W less power for much more performance.
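To illustrate why the connector warning matters, here are simplified pin maps (my recollection of the standard PCIe and EPS layouts; verify against the real pinout diagrams before wiring anything):

```python
# Simplified 8-pin layouts (assumption: standard spec pinouts). The point:
# several positions that carry ground on a PCIe cable carry +12 V on the
# M40's EPS-style socket, so a plain PCIe cable can short 12 V to ground.

PCIE_8PIN = ["12V", "12V", "12V", "GND", "GND", "GND", "GND", "GND"]  # 3x 12 V
EPS_8PIN  = ["GND", "GND", "GND", "GND", "12V", "12V", "12V", "12V"]  # 4x 12 V

mismatched = [i + 1 for i, (a, b) in enumerate(zip(PCIE_8PIN, EPS_8PIN))
              if a != b]
print("pins that differ:", mismatched)
```

Dual-PCIe-to-EPS adapter cables exist for exactly this reason.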
Re: Tesla M40 for deep learning
Posted: Fri May 01, 2020 12:24 pm
by lmcke169
Thank you for the responses. The forum didn't email me about them for some reason.
I finally have the M40 hooked up, and right now my cooling situation sucks. I 3D-printed a shroud that connects to one of my terrible case fans, so it thermal-throttles constantly, but I've purchased a high-speed fan and shroud from eBay that should help, and I might take the card apart and liquid-metal it.
I'm not using this M40 for speed. I'm using it for its 24 GB of VRAM, which lets me train at a higher resolution and batch size. With the throttling it's currently running at a fraction of the speed of my 1070, but I can push a 320-resolution model at a batch size of 8, versus the 1070 only handling 196 at a batch size of 6.
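Those numbers roughly track the rule of thumb that activation memory grows with resolution² × batch size (a simplification of my own; model depth, channel counts, and optimizer state matter too):

```python
# Back-of-envelope: conv autoencoder activation memory scales very roughly
# with resolution^2 * batch_size (assumption, not faceswap's real
# accounting).

def relative_activation_load(resolution, batch_size):
    return resolution ** 2 * batch_size

m40_load = relative_activation_load(320, 8)   # fits in the M40's 24 GB
gtx_load = relative_activation_load(196, 6)   # fits in the 1070's 8 GB
print(round(m40_load / gtx_load, 2))  # ~3.55x the working set
```

That ~3.5x larger working set lines up reasonably with the 3x jump from 8 GB to 24 GB of VRAM.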
Getting it running with DeepFaceLab was surprisingly easy. I can't use the full driver installer because of the 1070, but I was able to install the driver manually by unpacking the installer and adding it through Device Manager. The card isn't seen by most tools (FurMark, NZXT CAM, Cinebench), but DeepFaceLab and MSI Afterburner picked it up right away. I'll try it with faceswap after I give it a few days on DeepFaceLab.
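For anyone else running a mixed setup like this, the usual way to pin a training process to one card is the CUDA_VISIBLE_DEVICES environment variable (the index "1" below is an assumption; check the ordering with `nvidia-smi -L` first):

```python
# Sketch: expose only one GPU to the CUDA runtime. This must be set before
# TensorFlow (or any CUDA library) initializes, or it has no effect.
# The index "1" is an assumption about device ordering on this machine.

import os

os.environ["CUDA_VISIBLE_DEVICES"] = "1"  # hide the 1070, expose the M40
print(os.environ["CUDA_VISIBLE_DEVICES"])
```

Launching the trainer from a shell with the variable already exported works the same way.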
Re: Tesla M40 for deep learning
Posted: Mon May 04, 2020 7:53 pm
by Aubrey Drake Graham
Can we discuss this further on Discord? I'm looking into buying one. Aubrey Drake Graham#1991