Thanks for this post! This is exactly the kind of information and research that I would like to see from users.
I look forward to seeing how your model gets on.
Yes. It is linked from the very first post:
viewtopic.php?p=5367#p5367
I couldn't tell you the batch size just from your settings. See here for how to find the best batch size yourself:
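Finding the best batch size is usually just trial and error: start high and halve until a training step fits in memory. A minimal sketch of that search, where `train_step` stands in for one hypothetical training iteration that raises an out-of-memory error when the batch is too large (the names and the mock threshold below are illustrative, not faceswap's actual API):

```python
def find_max_batch_size(train_step, start=256, floor=1):
    """Halve the batch size until one training step fits in memory.

    `train_step` is a hypothetical callable that raises MemoryError
    (or your framework's OOM error) when the batch is too big.
    Returns the first batch size that runs, or None if none fit.
    """
    batch = start
    while batch >= floor:
        try:
            train_step(batch)
            return batch
        except MemoryError:
            batch //= 2
    return None

# Mock step for illustration: pretend anything over 48 samples
# exhausts VRAM. In practice you would run a real training step.
def mock_step(batch):
    if batch > 48:
        raise MemoryError(f"batch {batch} too large")

best = find_max_batch_size(mock_step, start=256)  # → 32
```

In practice, once you find the largest size that runs, it is worth backing off one step further, since memory use can spike later in training.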
Your use is very edge-case, as I have not seen it raised before.
Short answer: no, and not something I'm likely to add, I'm afraid.
Corrupted model. Lower the learning rate and roll back 50k iterations, then continue.
It's been a long time since I used DLight, but this is not an issue I have seen before.
Generally that model needs a lower LR (3.5e-5) as it tends to collapse otherwise.
That may solve your issue, it may not.
This is a connection issue with your setup/conda.
Do this and try again. Hopefully it will resolve any Conda-related issues:
app.php/faqpage#f1r1
OK, good to know, thanks. I will look into adding an option to put that on the CPU.
Looks like model corruption to me.
You'd need to try rolling back 50k or so, lowering learning rate a bit and continuing.
Removing the bug tag from this, as -d, --distributed is deprecated in favour of -D, --distribution-strategy.
No. It is purely a numbers thing. I doubt I would implement anything like this, because if you're at the point of reviewing every single frame, then you might as well review every single face. At which point, you're manually going through everything anyway, so could build your training set that way.
Which mask were you using? If it was a landmarks-based mask, the GPU isn't used at all; the mask is generated directly from the landmarks.
If it was an NN-based mask, that would seem unlikely. However, let me know which one, because if it really is as fast on CPU, that is something I would want to look at.
Protip: Don't report the forum owner for spam.
It is in the convert guide. Literally covered in the first section. Also this:
viewtopic.php?t=2090