I am trying out transfer learning at the moment. In a paper by Josh Meyer (http://jrmeyer.github.io/misc/MEYER_dissertation_2019.pdf) it is mentioned that there is a possibility to freeze layers, but I can't find any information on how to do that. Is it still not possible in the newest branch? If it is possible, is there a flag to enable freezing layers?
To freeze the last layer, use the flag
--drop_source_layers 1
It is described in the documentation:
https://deepspeech.readthedocs.io/en/v0.7.0/TRAINING.html#transfer-learning-new-alphabet
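For context, that flag just slots into the normal fine-tuning command. A rough sketch, assuming the v0.7 training script and released checkpoints (all paths and the other flag values below are placeholders, adjust them for your setup):

```
python3 DeepSpeech.py \
  --drop_source_layers 1 \
  --load_checkpoint_dir deepspeech-0.7.0-checkpoint \
  --save_checkpoint_dir fine_tune_checkpoints \
  --alphabet_config_path alphabet.txt \
  --train_files train.csv \
  --dev_files dev.csv \
  --test_files test.csv \
  --epochs 3 \
  --learning_rate 0.0001
```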
This flag removes layers, it doesn't freeze them. The remaining layers will be fine-tuned: "--drop_source_layers allows you to specify how many layers you want to remove from the pre-trained model. For example, if you supplied --drop_source_layers 3, you will drop the last three layers of the pre-trained model."
I am looking for a way to freeze layers.
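As far as I can tell there is no flag for this, so freezing would mean editing the training code yourself. Conceptually it just means keeping the source layers' variables out of the optimizer's var_list so they never receive gradient updates. A minimal TF1-style sketch of the idea (the toy model and scope names here are made up, not the actual DeepSpeech graph):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Toy two-layer model standing in for a pre-trained network.
x = tf.placeholder(tf.float32, [None, 4])
y = tf.placeholder(tf.float32, [None, 1])

with tf.variable_scope("layer_1"):   # pretend this is a pre-trained layer we want frozen
    h = tf.layers.dense(x, 8, activation=tf.nn.relu)
with tf.variable_scope("layer_2"):   # the layer we actually want to fine-tune
    out = tf.layers.dense(h, 1)

loss = tf.reduce_mean(tf.square(out - y))

# "Freezing" = only hand the optimizer the variables you want updated.
to_train = [v for v in tf.trainable_variables() if v.name.startswith("layer_2")]
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss, var_list=to_train)
```

In DeepSpeech itself the equivalent change would be filtering the var_list where the optimizer is created in the training script.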
So there is no way to freeze layers?