How to implement RBF activation function in Keras?

I am creating a customized activation function, RBF activation function in particular:

from keras import backend as K
from keras.layers import Lambda

gamma = 1.0  # scalar width parameter (see the comments below)

l2_norm = lambda a, b: K.sqrt(K.sum(K.pow((a - b), 2), axis=0, keepdims=True))

def rbf2(x):
    X = ...  # here I need the inputs that I receive from the previous layer
    Y = ...  # here I need the weights that I should apply for this layer
    l2 = l2_norm(X, Y)
    res = K.exp(-1 * gamma * K.pow(l2, 2))
    return res

The function rbf2 receives the previous layer as input:

# some Keras layers
model.add(Dense(84, activation='tanh'))  # layer1
model.add(Dense(10, activation=rbf2))    # layer2

What should I do to get the inputs from layer1 and weights from layer2 to create the customized activation function?

What I am actually trying to do is implement the output layer of the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector.

For example, layer1 has 84 neurons and layer2 has 10 neurons. Normally, to compute the output of each of the 10 neurons in layer2, we take the dot product of the 84 outputs of layer1 with the 84 weights connecting layer1 to that neuron, and then apply a softmax activation over the results.

But here, instead of taking the dot product, each neuron of layer2 outputs the square of the Euclidean distance between its input vector and its weight vector (this is what I want to use as my activation function).
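
For concreteness, here is a minimal NumPy sketch contrasting the two computations (the array contents and gamma = 1.0 are illustrative, not from the original code):

import numpy as np

x = np.random.rand(84)        # output of layer1 (84 neurons)
W = np.random.rand(84, 10)    # weights between layer1 and layer2
gamma = 1.0                   # assumed scalar, as stated in the comments below

dense_out = x @ W                                 # usual dense layer: dot product, shape (10,)
sq_dist = ((x[:, None] - W) ** 2).sum(axis=0)     # squared Euclidean distance per neuron, shape (10,)
rbf_out = np.exp(-gamma * sq_dist)                # RBF activation over those distances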

Any help with creating an RBF activation function (computing the Euclidean distance between the inputs a layer receives and its weights) and using it in a layer would be appreciated.

Oneman asked 19/12, 2018 at 17:00. Comments (5):
Do you mean you want to get the output of layer1 and layer2 and pass it to your rbf function? If that's the case, are you sure it would work with the current definition of your activation function, since they have different shapes? – Prolocutor
What I am actually trying to do is implement the output layer of the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector. – Oneman
In short, I need the outputs of layer1 and the weights of each neuron of layer2, and I want to calculate the Euclidean distance between them. – Oneman
Is gamma a single parameter for each neuron, or is it a vector? I think it is a single parameter. – Prolocutor
Yes, it is a single parameter. I am using gamma = 1. – Oneman

You can simply define a custom layer for this purpose:

from keras.layers import Layer
from keras import backend as K

class RBFLayer(Layer):
    def __init__(self, units, gamma, **kwargs):
        super(RBFLayer, self).__init__(**kwargs)
        self.units = units                    # number of RBF neurons
        self.gamma = K.cast_to_floatx(gamma)  # scalar width parameter

    def build(self, input_shape):
        # one trainable center vector (mu) per unit: shape (input_dim, units)
        self.mu = self.add_weight(name='mu',
                                  shape=(int(input_shape[1]), self.units),
                                  initializer='uniform',
                                  trainable=True)
        super(RBFLayer, self).build(input_shape)

    def call(self, inputs):
        # inputs: (batch, input_dim) -> (batch, input_dim, 1);
        # broadcasting against mu (input_dim, units) gives (batch, input_dim, units)
        diff = K.expand_dims(inputs) - self.mu
        # squared Euclidean distance to each center: (batch, units)
        l2 = K.sum(K.pow(diff, 2), axis=1)
        res = K.exp(-1 * self.gamma * l2)
        return res

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)

Example usage:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(20, input_shape=(100,)))
model.add(RBFLayer(10, 0.5))
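
To mirror the LeNet-5 setup from the question (84 tanh units feeding 10 RBF output neurons), the layer could be used like this. Note that the input_shape, gamma = 1.0, and the trailing softmax are assumptions for illustration, not part of the original answer:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(84, activation='tanh', input_shape=(120,)))  # layer1 (input_shape assumed)
model.add(RBFLayer(10, gamma=1.0))                           # layer2: 10 RBF neurons
model.add(Activation('softmax'))                             # assumed, if class probabilities are needed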
Prolocutor answered 20/12, 2018 at 10:45. Comments (2):
Comments are not for extended discussion; this conversation has been moved to chat. – Foreordain
Can someone explain what gamma, kwargs, and self.mu are? – Grief

There is no need to reinvent the wheel here. A custom RBF layer for Keras already exists.

Pyroelectricity answered 26/9, 2019 at 12:54.
