Keras: How to use max_value in the ReLU activation function

The ReLU function as defined in keras/activations.py is:

    def relu(x, alpha=0., max_value=None):
      return K.relu(x, alpha=alpha, max_value=max_value)

It has a max_value argument that can be used to clip the output. Now, how can this be used/called in code? I have tried the following: (a)

    model.add(Dense(512, input_dim=1))
    model.add(Activation('relu', max_value=250))

which raises:

    assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
    AssertionError: Keyword argument not understood: max_value

(b)

    Rel = Activation('relu', max_value=250)

same error

(c)

    from keras.layers import activations
    uu = activations.relu(??,max_value=250)

The problem with this is that it expects the input tensor as its first argument. The error is 'relu() takes at least 1 argument (1 given)'.

So how do I make this a layer?

    model.add(activations.relu(max_value=250))

fails with the same error: 'relu() takes at least 1 argument (1 given)'.

If this file cannot be used as a layer, then there seems to be no way of specifying a clip value for ReLU. This implies that the comment here https://github.com/fchollet/keras/issues/2119 closing a proposed change is wrong... Any thoughts? Thanks!

Combatant answered 20/12, 2016 at 22:54

You can use the ReLU function of the Keras backend. To do so, first import the backend:

from keras import backend as K

Then you can pass your own function as the activation, using backend functionality. This would look like:

def relu_advanced(x):
    return K.relu(x, max_value=250)

Then you can use it like this:

model.add(Dense(512, input_dim=1, activation=relu_advanced))

or

model.add(Activation(relu_advanced))

Unfortunately, you must hard-code the additional arguments this way. Therefore, it is better to use a function that returns your function and passes in your custom values:

def create_relu_advanced(max_value=1.):
    def relu_advanced(x):
        return K.relu(x, max_value=K.cast_to_floatx(max_value))
    return relu_advanced

Then you can pass your arguments by either

model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))

or

model.add(Activation(create_relu_advanced(max_value=250)))
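
As a quick sanity check, you can evaluate the closure directly through the backend (a minimal sketch, assuming the create_relu_advanced helper defined above):

import numpy as np
from keras import backend as K

clipped = create_relu_advanced(max_value=250)
x = K.variable(np.array([-5., 100., 500.]))
print(K.eval(clipped(x)))   # expected roughly [0., 100., 250.]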
Joy answered 22/3, 2017 at 14:31
How do you load the model? ... I seem to get the error message ValueError: Unknown activation function: relu_advanced – Chaste
@Chaste have a look at my answer - it solves the reading issue, if it still matters for you... – Saith
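
Regarding the loading question in the comments: the saved model stores the inner function's name, relu_advanced, so that name has to be mapped in custom_objects when loading (a minimal sketch, with model.h5 as a placeholder path and assuming create_relu_advanced from above):

from keras.models import load_model

# Rebuild the activation with the same max_value used for training and
# map the stored name 'relu_advanced' back to it ('model.h5' is a placeholder path).
model = load_model('model.h5',
                   custom_objects={'relu_advanced': create_relu_advanced(max_value=250)})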

That is as easy as one lambda:

from keras.activations import relu
clipped_relu = lambda x: relu(x, max_value=3.14)

Then use it like this:

model.add(Conv2D(64, (3, 3)))
model.add(Activation(clipped_relu))

When reading a model saved in HDF5, use the custom_objects dictionary:

model = load_model(model_file, custom_objects={'<lambda>': clipped_relu})
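
A minimal save/load round trip with this lambda (sketch only; the layer sizes and the path clipped_relu_model.h5 are made up):

from keras.models import Sequential, load_model
from keras.layers import Dense, Activation
from keras.activations import relu

clipped_relu = lambda x: relu(x, max_value=3.14)

model = Sequential()
model.add(Dense(32, input_dim=10))
model.add(Activation(clipped_relu))
model.compile(optimizer='adam', loss='mse')

model.save('clipped_relu_model.h5')   # the activation is stored under the name '<lambda>'
restored = load_model('clipped_relu_model.h5',
                      custom_objects={'<lambda>': clipped_relu})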
Saith answered 12/11, 2018 at 14:8

Tested below; it works:

import keras

def clip_relu(x):
    return keras.activations.relu(x, max_value=1.)

predictions = Dense(num_classes, activation=clip_relu, name='output')
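
A short usage sketch placing this in a functional-API model (the input size, hidden width, and num_classes are made up; clip_relu is the function defined above):

from keras.layers import Input, Dense
from keras.models import Model

num_classes = 10                        # made-up value
inputs = Input(shape=(100,))            # made-up input size
hidden = Dense(64, activation='relu')(inputs)
predictions = Dense(num_classes, activation=clip_relu, name='output')(hidden)
model = Model(inputs, predictions)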
Spoilt answered 16/7, 2019 at 17:22

This is what I did, using a Lambda layer to implement a clipped ReLU.

Step 1: define a function that does the clipping (K is the Keras backend):

def reluclip(x, max_value=20):
    return K.relu(x, max_value=max_value)

Step 2: add a Lambda layer to the model:

y = Lambda(function=reluclip)(y)
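
Putting both steps together, a minimal functional-API sketch (the layer sizes are made up):

from keras import backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

def reluclip(x, max_value=20):
    return K.relu(x, max_value=max_value)

inputs = Input(shape=(16,))             # made-up input size
y = Dense(32)(inputs)                   # linear layer; activation applied separately
y = Lambda(function=reluclip)(y)        # clipped ReLU via the Lambda layer
outputs = Dense(1)(y)
model = Model(inputs, outputs)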

Bramble answered 17/10, 2017 at 18:41
