I just wanted to give a little feedback on an issue I had and could not really solve until yesterday.
The issue was that Python crashed; the first error before that was that the GPU was not being used (from this thread: viewtopic.php?t=2326).
I'm posting this here now because I was able to solve the issue yesterday with a completely fresh manual installation.
I kept Anaconda but deleted FS and the FS env in conda and took an older branch of FS. I'm not exactly sure if it was R2.1 or R2.2, but I created the folder in the directory where it should be, added the env in conda with Python 3.7 instead of 3.8 this time, and that seems to have solved the issue.
nvidia-smi still says I have CUDA 11.4 installed, but I guess CUDA is backward compatible, as the FS installation detected CUDA 10.2.
From the terminal I installed nvidia-ml-py3 and tensorflow-gpu 2.2.0 manually, and FS started normally. I still needed some models for extracting, but I still had them on my system and moved them into the correct directory.
Extracting worked fine; only training gave me some error messages, but I was using old files I had created before. I extracted the older files again, deleted the files I didn't need, and started training again, and it worked fine. So the error was maybe because those files were created/trained with a different version of FS.
While extracting and training I checked Task Manager for CPU usage and GPU-Z for GPU memory usage, and now FS is working at 100% of the GPU memory and does not use the CPU. Training is as fast as I remember it.
I haven't changed anything else on my system, so maybe it was because of the Python version I installed, or maybe newer versions of FS just don't work with my system.
One thing I noticed before was that there were a few issues with newer versions of tensorflow-gpu: it said I needed a specific version higher than the installed one, but the installed version was actually higher than the one it displayed. (I can't say exactly which version it was, but FS thought it was lower than the required version.)
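Just a guess on my side as to what could cause that kind of message (I don't know what FS actually does internally): if a version check compares version strings as plain text instead of as numbers, a newer version like 2.10 sorts below 2.2. A minimal Python sketch of that pitfall:

```python
# Hypothetical illustration (not FS's actual code): comparing version
# strings lexicographically can report a newer version as "too old".
required = "2.2.0"
installed = "2.10.0"  # numerically newer than 2.2.0

# String comparison gets this wrong: '1' < '2' at the third character,
# so "2.10.0" looks older than "2.2.0".
print(installed < required)  # True

def version_tuple(v):
    """Split a dotted version string into a tuple of ints for comparison."""
    return tuple(int(part) for part in v.split("."))

# Comparing numeric tuples gets it right: (2, 10, 0) > (2, 2, 0).
print(version_tuple(installed) < version_tuple(required))  # False
```

In real code you would use a proper version class (e.g. `packaging.version.Version`) instead of a hand-rolled tuple split, but the sketch shows why a displayed "required version" can look higher than an installed version that is in fact newer.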
So FS works fine again for me, as it should.