
Do VRAM Requirements Fall?

Posted: Sat Aug 27, 2022 7:13 am
by MaxHunter

Do VRAM requirements fall as the model grows?

I'm sorry if this is a newbie question, but I can't understand why one model fails with the same settings while another works exceptionally well. I started to wonder whether I had trained my previous model with the Central Storage distribution strategy and switched over later, because when I started training my new model it kept crashing until I switched to Central Storage. After several thousand iterations of training, can I switch back to the default distribution strategy?


Re: Do VRAM Requirements Fall?

Posted: Sat Aug 27, 2022 7:16 am
by torzdf

No, but different versions of Tensorflow may handle VRAM allocation differently.

I do suspect that there is some kind of memory leak in recent versions of Tensorflow. Not big, but enough that a model may OOM after several thousand iterations (this should not be able to happen, as TF allocates all the VRAM it requires at the start).
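As an aside, TF's grab-all-VRAM-at-startup behaviour can be changed. A minimal sketch (assuming TensorFlow 2.x; this is a general TF option, not a Faceswap setting) that asks TF to allocate GPU memory on demand instead of reserving it all up front:

```python
import tensorflow as tf

# By default, TensorFlow reserves (almost) all GPU memory at startup.
# Enabling memory growth makes it allocate VRAM on demand instead.
# This must be called before any GPU has been initialised.
gpus = tf.config.list_physical_devices("GPU")
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)
```

With memory growth enabled, allocation rises gradually during training, which can make a slow leak easier to spot in a GPU monitor, though it does not prevent an eventual OOM if a leak exists.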