Thanks for that final word.
Back to scratch for the model.
Search found 5 matches
- Thu Dec 03, 2020 12:53 pm
- Forum: Training Discussion
- Topic: Changing coverage mid training
- Replies: 2
- Views: 1812
Re: Changing coverage mid training
- Wed Dec 02, 2020 5:32 pm
- Forum: Training Discussion
- Topic: Changing coverage mid training
- Replies: 2
- Views: 1812
Changing coverage mid training
I trained a model using Villain with 85% coverage. After 250k iterations I did a test conversion and the results are reasonable, but there is a shadow on the temple/side of the forehead. I was thinking I was too aggressive with the coverage in the model, so is it possible to reduce coverage mid tra...
- Wed Nov 18, 2020 9:39 pm
- Forum: Convert Discussion
- Topic: Converted faces are blurry
- Replies: 16
- Views: 28773
Re: DLight model still blurry after 400k iterations
I then did the following: from training A -> B, I switched to training B -> A. Focus on the two different sides has vastly improved conversion, which was done at 350K and 450K iterations. So my suggestion is: when you have trained A -> B for 400K iterations, switch to B -> A; it also helps strengthen the foundation ...
- Mon Oct 26, 2020 3:59 pm
- Forum: Hardware
- Topic: GPU upgrade on a budget: 1070 8GB vs. 1660 6GB
- Replies: 9
- Views: 11897
Re: GPU upgrade on a budget: 1070 8GB vs. 1660 6GB
Thanks for the responses. Not much clarity, but plenty to think about. With divided opinions, it seems I can't go completely wrong with either option, so that calmed some worries.
Again, thanks for the help.
- Sat Oct 24, 2020 3:06 pm
- Forum: Hardware
- Topic: GPU upgrade on a budget: 1070 8GB vs. 1660 6GB
- Replies: 9
- Views: 11897
Re: GPU upgrade on a budget: 1070 8GB vs. 1660 6GB
To expand on this topic.
I'm working on a budget and am thinking about a 2060 (6 GB) or a used 1070 or 1080 (8 GB). What's better: tensor cores or 2 GB of extra VRAM?