Can a 2060s or 2060 run dfl-sae at 256 resolution?

Talk about Hardware used for Deep Learning


swapration
Posts: 23
Joined: Thu Sep 10, 2020 1:21 am

Can a 2060s or 2060 run dfl-sae at 256 resolution?

Post by swapration »

With mixed precision, is the 8GB/6GB of VRAM on the 2060S/2060 enough for DFL-SAE at a resolution of 256, or is that not enough memory?
If it is enough to run it, what kind of EG/s does it run at?
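For context: mixed precision keeps most activations in fp16, which roughly halves activation memory, but weights and optimizer state normally stay in fp32, so it does not simply halve total VRAM use. A generic TensorFlow illustration of the idea (not DFL's own training code, and the layer sizes are made up):

```python
# Generic mixed-precision sketch, NOT DFL's own training code.
# Activations/compute run in float16; weights and optimizer state stay float32.
import tensorflow as tf

tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(64, 5, strides=2, padding="same", activation="relu",
                           input_shape=(256, 256, 3)),
    tf.keras.layers.Conv2D(128, 5, strides=2, padding="same", activation="relu"),
    # ...a real autoencoder would have a bottleneck and decoder here...
    tf.keras.layers.Conv2DTranspose(3, 5, strides=4, padding="same",
                                    dtype="float32"),  # keep the output in fp32
])
model.compile(optimizer="adam", loss="mae")
model.summary()
```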

abigflea
Posts: 182
Joined: Sat Feb 22, 2020 10:59 pm
Answers: 2
Has thanked: 20 times
Been thanked: 62 times

Re: Can a 2060s or 2060 run dfl-sae at 256 resolution?

Post by abigflea »

It should.

The speed depends on a lot of variables, but it should be reasonable. Just get the biggest batch size you can fit; a rough way to probe for that is sketched below.
Usually after 24 hours you can see whether your data is going to work.

When you do a final train to get it really good for a convert, you'll probably end up running it for 3 or 4 days. I have often run Villain to 1 million iterations, but sometimes 300k is good enough.
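A rough way to find "the biggest batch size you can" (just an illustrative sketch, not anything built into DFL; build_model() below is a hypothetical stand-in for whatever model you are training): keep doubling the batch until the GPU runs out of memory, and use the last size that worked.

```python
# Illustrative batch-size probe, not part of DFL.
# build_model() is assumed to return a compiled Keras autoencoder.
import numpy as np
import tensorflow as tf

def largest_batch(build_model, res=256, limit=64):
    best, bs = None, 1
    while bs <= limit:
        try:
            tf.keras.backend.clear_session()  # free memory from the last attempt
            model = build_model()
            x = np.zeros((bs, res, res, 3), dtype=np.float32)
            model.fit(x, x, batch_size=bs, epochs=1, verbose=0)  # one throwaway step
            best = bs   # it fit, remember it
            bs *= 2     # and try something bigger
        except tf.errors.ResourceExhaustedError:
            break       # out of memory: stick with the last size that fit
    return best
```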

:o I dunno what I'm doing :shock:
2x RTX 3090 : RTX 3080 : RTX 2060 : 2x RTX 2080 Super : Ghetto 1060

cosmico
Posts: 95
Joined: Sat Jan 18, 2020 6:32 pm
Has thanked: 13 times
Been thanked: 35 times

Re: Can a 2060s or 2060 run dfl-sae at 256 resolution?

Post by cosmico »

To help you out, I just tried to create a DFL-SAE model at 256 on the project I'm currently running with Original.
I was unable to get a single 256 model to start with any of the six settings I tried:

DFL-SAE 256, allow growth, mixed precision, DF architecture, default encoders, clipnorm on, batch size 1 = no.
Then I changed it to the LIAE architecture with the same settings, and nope.
Then I tried the same settings with clipnorm off, for both DF and LIAE, and both were a nope.
Then I tried the same settings with allow growth and clipnorm both off, for both DF and LIAE, and the result was still nope.

Personally, the biggest model I've gotten to work at even slightly reasonable speed was a 160-pixel model. I'm really struggling to remember the speed, but it was somewhere between 2 and 15 EG/s. I abandoned it after estimating that, with the data I had, it would take 400 hours on the very low end (rough arithmetic for that kind of estimate is sketched after this post).

My hardware:
GeForce RTX 2060
16GB RAM @ 3300
Ryzen 5 3600, 6 cores

Maybe you don't need a 256-sized model, though. Maybe you can get results you're happy with from a smaller model. I made a little mockup showing what kind of quality you can get if you hypothetically trained your model to the max and then stretched it out to cover faces of varying sizes. It's a 4000x2000 image, so open it and zoom in. [image]

If abigflea can actually get his 2060 to run a 256-sized model, then I'm very interested in how he got it working.
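If you want to sanity-check an estimate like that 400 hours, the arithmetic is just target examples divided by training speed, assuming EG/s means examples (faces) processed per second. A tiny sketch with made-up numbers, not cosmico's actual figures:

```python
# Back-of-the-envelope training ETA. All numbers below are hypothetical.
def training_hours(iterations, batch_size, eg_per_sec):
    examples = iterations * batch_size
    return examples / (eg_per_sec * 3600.0)

# e.g. 1,000,000 iterations at batch size 4 and 3 EG/s:
print(round(training_hours(1_000_000, 4, 3)))  # ~370 hours
```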

abigflea
Posts: 182
Joined: Sat Feb 22, 2020 10:59 pm
Answers: 2
Has thanked: 20 times
Been thanked: 62 times

Re: Can a 2060s or 2060 run dfl-sae at 256 resolution?

Post by abigflea »

On 6GB?
I don't have that card installed at the moment; the computer is a mess of wires.
I do know I was doing it in Linux, with a small batch size, like 2.
I know I had toned down the encoder dims a few percent and turned clipnorm off... it's been a while.

176px and 192px seemed to be just fine for anything I was doing.
This is a good explanation!

DFL-SAE and Villain surely eat a lot of VRAM.

:o I dunno what I'm doing :shock:
2x RTX 3090 : RTX 3080 : RTX 2060 : 2x RTX 2080 Super : Ghetto 1060

swapration
Posts: 23
Joined: Thu Sep 10, 2020 1:21 am

Re: Can a 2060s or 2060 run dfl-sae at 256 resolution?

Post by swapration »

Thanks, that clears up a lot.
Sounds like it really is best to go for 8GB at the minimum. Do you guys think the 3060, with its 12GB, will turn out to be the best bang for your buck once the 30 series is fully supported?

abigflea
Posts: 182
Joined: Sat Feb 22, 2020 10:59 pm
Answers: 2
Has thanked: 20 times
Been thanked: 62 times

Re: Can a 2060s or 2060 run dfl-sae at 256 resolution?

Post by abigflea »

I still think a 2070 would be just fine unless you are working toward some kind of production schedule.
Just have some patience.

Additionally, the 30XX series is still not directly supported.

:o I dunno what I'm doing :shock:
2x RTX 3090 : RTX 3080 : RTX 2060 : 2x RTX 2080 Super : Ghetto 1060
