trainable=False option

Are there straightforward ways to set specific layers to be non-trainable, i.e. so their weights do not update?

I’d like to fine-tune only the last few layers if possible.
I was hoping that the recent transfer learning commits offered such a capability, but it looks like they only re-initialize the selected layers and still train the entire set of weights.

At first glance, it seems that we could set “tfv1.get_variable” in “variable_on_cpu” to trainable=False (for the layers we want frozen) and also initialize tf.contrib.rnn.LSTMBlockFusedCell with trainable=False. Would this do the trick, or are there other potential pitfalls?

Any pointers would be very much appreciated.

Setting trainable=False should be enough.

When loading from a pre-trained model, setting trainable=False in dense doesn’t seem to have any effect: I still see all the vars in tfv1.trainable_variables().
The other issue is that tf.contrib.rnn.LSTMBlockFusedCell doesn’t take a trainable argument.

Looks like the right way to do this is to pass the list of trainable variables to compute_gradients via its var_list argument. Will try this instead.
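To make the var_list idea concrete, here is a minimal, TF-free sketch of the filtering step. The variable names (`layer_1/weights`, etc.) and the `FROZEN_PREFIXES` set are hypothetical; in the real graph the list would come from tfv1.trainable_variables(), and the filtered result would be handed to the optimizer.

```python
# Sketch: freeze early layers by filtering the trainable-variable list
# before gradient computation. Names and prefixes here are made up for
# illustration; in practice you would filter actual tf.Variable objects
# by their .name attribute.

FROZEN_PREFIXES = ("layer_1/", "layer_2/", "layer_3/")  # layers to freeze

def select_trainable(all_variable_names):
    """Return only the variables that should receive gradient updates."""
    return [name for name in all_variable_names
            if not name.startswith(FROZEN_PREFIXES)]

all_vars = [
    "layer_1/weights", "layer_1/bias",
    "layer_2/weights", "layer_2/bias",
    "layer_3/weights", "layer_3/bias",
    "layer_5/weights", "layer_5/bias",  # e.g. the final output layer
]

train_vars = select_trainable(all_vars)
print(train_vars)  # ['layer_5/weights', 'layer_5/bias']
```

The filtered list would then be passed as `optimizer.compute_gradients(loss, var_list=train_vars)`, so gradients are only computed and applied for the unfrozen layers; this sidesteps both the trainable=False issue with restored variables and LSTMBlockFusedCell’s missing trainable argument.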