It should work. I'm doing the opposite: training first in fp16 and then fine-tuning in fp32. When you train in fp16 (mixed precision), the entire saved model will still be fp32.
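As a simplified illustration of why switching between the two precisions is safe in that direction (this is a hedged sketch with numpy, not the trainer's actual save logic): upcasting fp16 weights to fp32 is lossless, while downcasting fp32 to fp16 rounds away low-order bits.

```python
import numpy as np

# Hypothetical fp16 weights, e.g. as trained in half precision.
w_fp16 = np.array([0.1, 0.2, 0.3], dtype=np.float16)

# Upcast fp16 -> fp32: every fp16 value is exactly representable
# in fp32, so fine-tuning these weights in fp32 loses nothing.
w_fp32 = w_fp16.astype(np.float32)
print(w_fp32.dtype)  # float32

# Downcast fp32 -> fp16: values are rounded to the nearest fp16,
# so going the other way can lose precision.
x = np.float32(0.123456789)
print(np.float16(x))
```

This matches the behavior above: mixed-precision training keeps (and saves) fp32 master weights, so the checkpoint can be fine-tuned in full fp32 afterwards.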
No, sorry
You can ask here, maybe there are new updates: