I'm trying to use the functional API to have a shared layer where only one of the paths is trainable:
a_in = Input(x_shape)
b_in = Input(x_shape)
a_out = my_model(a_in)  # I want these weights to be trainable
b_out = my_model(b_in)  # I want these weights to be non-trainable (no gradient update)
y_out = my_merge(a_out, b_out)
full_model = Model(inputs=[a_in, b_in], outputs=[y_out])
full_model.compile(...)
I can't figure out how to do this, though. Setting `my_model.trainable` affects both paths, since the weights are shared. I could compile two separate models with different trainable flags, but then I don't see how to combine two pre-compiled models so that they optimize a single merged cost function.
Is this even possible to do with Keras? And if not, is it possible in TensorFlow?
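To make the goal concrete, here is a minimal sketch of the wiring I'm after, using `tf.stop_gradient` on the second path via a `Lambda` layer (I'm not sure this is the idiomatic Keras way). The `Dense` layer and `Concatenate` merge here are just stand-ins for `my_model` and `my_merge`:

```python
import tensorflow as tf
from tensorflow.keras.layers import Concatenate, Dense, Input, Lambda
from tensorflow.keras.models import Model

x_shape = (4,)
shared = Dense(3, name="shared")  # stand-in for the shared my_model

a_in = Input(x_shape)
b_in = Input(x_shape)

a_out = shared(a_in)  # gradients flow through this path
# Block gradients on the second path, so the shared weights
# are only updated through the first path:
b_out = Lambda(tf.stop_gradient)(shared(b_in))

y_out = Concatenate()([a_out, b_out])  # stand-in for my_merge
full_model = Model(inputs=[a_in, b_in], outputs=y_out)
full_model.compile(optimizer="sgd", loss="mse")
```

If this works, the forward pass still uses the shared weights on both paths, but backprop only reaches them through `a_out`.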