You should be able to pass a trainable=False
argument to your layer definition, or set the layer.trainable = False
property after creating your layer. In the latter case you need to compile the model again afterward for the change to take effect. See the Keras FAQ here.
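For example, here is a minimal sketch of both approaches, assuming the standalone Keras API (the model and layer sizes are made up for illustration):

from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense

model = Sequential()
# Option 1: freeze at definition time
model.add(Conv2D(16, (3, 3), trainable=False, input_shape=(32, 32, 3)))
model.add(Flatten())
model.add(Dense(10, activation='softmax'))

# Option 2: freeze after creation -- then you must compile again
model.layers[0].trainable = False
model.compile(optimizer='adam', loss='categorical_crossentropy')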
You can then set constant weights for the layer by passing a kernel_initializer=initializer
argument. More information on initializers can be found here. If you already have a weight matrix defined somewhere, I think you will need to define a custom initializer that sets the weights to your desired values; the linked page shows how to define custom initializers at the bottom. Something as simple as the following might work, assuming you have my_constant_weight_matrix
defined:
def my_init(shape, dtype=None):
    # Note it must take the arguments 'shape' and 'dtype'.
    # The returned matrix must match the 'shape' Keras requests for the kernel.
    return my_constant_weight_matrix
model.add(Conv2D(..., kernel_initializer=my_init)) # replace '...' with your args
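If it helps, here is a self-contained sketch combining the two ideas; the matrix shape (3, 3, 1, 4) is hypothetical and must match what the layer expects (kernel height, kernel width, input channels, filters):

import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D

# Hypothetical constant kernel: 3x3 kernels, 1 input channel, 4 filters
my_constant_weight_matrix = np.random.rand(3, 3, 1, 4)

def my_init(shape, dtype=None):
    return my_constant_weight_matrix

model = Sequential()
model.add(Conv2D(4, (3, 3), kernel_initializer=my_init,
                 trainable=False, input_shape=(28, 28, 1)))

# Sanity check: the layer's kernel should now equal the constant matrix
kernel = model.layers[0].get_weights()[0]
assert np.allclose(kernel, my_constant_weight_matrix)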
That said, I have not verified this myself, and when I searched I saw a number of bug reports about layer freezing not working correctly. Worth a shot, though.