The ReLU function, as defined in keras/activations.py, is:
def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)
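As a sanity check that max_value does clip at the backend level, here is a minimal sketch (assuming the TensorFlow backend; the input values are made up):

from keras import backend as K

x = K.variable([-1., 5., 300.])   # made-up example values
y = K.relu(x, max_value=250.)     # negatives become 0, values above 250 are clipped to 250
print(K.eval(y))                  # prints something like [   0.    5.  250.]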
So max_value can be used to clip the activation's output. But how can this be used/called when building a model? I have tried the following:
(a)
model.add(Dense(512,input_dim=1))
model.add(Activation('relu',max_value=250))
This fails with:
assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
AssertionError: Keyword argument not understood: max_value
(b)
Rel = Activation('relu', max_value=250)
This gives the same error.
(c)
from keras.layers import activations
uu = activations.relu(??,max_value=250)
The problem with this is that it expects the input tensor as the first argument. The error is: 'relu() takes at least 1 argument (1 given)'
So how do I make this a layer?
model.add(activations.relu(max_value=250))
fails with the same issue: 'relu() takes at least 1 argument (1 given)'.
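The closest thing to a workaround I can see is to bind max_value in a small wrapper function and pass that callable instead of the string 'relu' (Activation appears to accept a function as well as a name). This is only a sketch, and clipped_relu is just a name I made up:

from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, Activation

# wrapper that fixes max_value; Keras supplies the input tensor later,
# when it applies the activation to the layer's output
def clipped_relu(x):
    return K.relu(x, max_value=250)

model = Sequential()
model.add(Dense(512, input_dim=1))
model.add(Activation(clipped_relu))   # pass the callable, not a string

Presumably Dense(512, activation=clipped_relu) would work the same way, though I have not verified this across versions.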
If this file cannot be used as a layer, then there seems to be no built-in way of specifying a clip value for ReLU. This would imply that the comment here https://github.com/fchollet/keras/issues/2119 closing a proposed change is wrong... Any thoughts? Thanks!