activation-function Questions
2
Solved
As I understand it, in a deep neural network, we use an activation function (g) after applying the weights (w) and bias (b) (z := w * X + b | a := g(z)). So there is a composition of functions (g o z...
Thesda asked 21/9, 2018 at 15:20
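For the question above, a minimal NumPy sketch of that composition; the shapes and the choice of g are illustrative, not from the question:
import numpy as np

def forward(X, w, b, g=np.tanh):
    # z := w @ X + b, then a := g(z) -- i.e. the composition a = g(z(X))
    z = w @ X + b
    return g(z)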
5
Solved
I am trying to implement leaky ReLU; the problem is I have to do 4 for loops for a 4-dimensional input array.
Is there a way that I can do leaky relu only using Numpy functions?
Boresome asked 24/5, 2018 at 20:20
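A loop-free sketch for the question above, using only NumPy broadcasting (the alpha value is illustrative):
import numpy as np

def leaky_relu(x, alpha=0.01):
    # element-wise on arrays of any rank, including 4-D inputs, no Python loops
    return np.where(x > 0, x, alpha * x)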
4
Solved
This may be a very basic/trivial question.
For Negative Inputs,
Output of ReLU Activation Function is Zero
Output of Sigmoid Activation Function is Zero
Output of Tanh Activation Function is -...
Spiccato asked 27/2, 2020 at 15:34
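A quick numeric check of the three functions at a negative input, e.g. x = -2 (the value is illustrative):
import numpy as np

x = -2.0
relu_out = max(0.0, x)                  # 0.0
sigmoid_out = 1.0 / (1.0 + np.exp(-x))  # ~0.119: small and positive, not exactly zero
tanh_out = np.tanh(x)                   # ~-0.964: negative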
3
Solved
I have recently been reading the WaveNet and PixelCNN papers, and in both of them they mention that using gated activation functions works better than a ReLU. But in neither case do they offer an expl...
Devout asked 9/5, 2019 at 14:18
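For reference, the gated activation unit described in those papers is z = tanh(W_f * x) * sigmoid(W_g * x); a PyTorch-style sketch, with illustrative 1x1 convolutions where the papers use (dilated) convolutions:
import torch
import torch.nn as nn

class GatedActivation(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.filter_conv = nn.Conv1d(channels, channels, kernel_size=1)
        self.gate_conv = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, x):
        # tanh "filter" branch modulated by a sigmoid "gate" branch
        return torch.tanh(self.filter_conv(x)) * torch.sigmoid(self.gate_conv(x))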
3
Solved
I'm having issues with implementing custom activation functions in Pytorch, such as Swish. How should I go about implementing and using custom activation functions in Pytorch?
Loo asked 19/4, 2019 at 17:0
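A common sketch for a custom activation such as Swish in PyTorch: build it from differentiable torch ops so autograd handles the backward pass (the layer sizes in the usage line are illustrative):
import torch
import torch.nn as nn

class Swish(nn.Module):
    def forward(self, x):
        # swish(x) = x * sigmoid(x); composed of torch ops, so no custom backward is needed
        return x * torch.sigmoid(x)

model = nn.Sequential(nn.Linear(10, 10), Swish(), nn.Linear(10, 1))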
5
Solved
How to change the activation layer of a Pytorch pretrained network?
Here is my code:
print("All modules")
for child in net.children():
    if isinstance(child, nn.ReLU) or isinstance(child, nn.SELU)...
Kaunas asked 9/10, 2019 at 4:44
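One possible approach for the question above (a sketch, assuming the activations appear as child modules): walk the module tree and replace them by attribute name. The function name and the choice of LeakyReLU are illustrative.
import torch.nn as nn

def replace_relu_with(model, new_act=nn.LeakyReLU):
    # swap every nn.ReLU / nn.SELU child for a new activation, recursively
    for name, child in model.named_children():
        if isinstance(child, (nn.ReLU, nn.SELU)):
            setattr(model, name, new_act())
        else:
            replace_relu_with(child, new_act)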
2
Solved
How would I implement the derivative of Leaky ReLU in Python without using Tensorflow?
Is there a better way than this? I want the function to return a numpy array
def dlrelu(x, alpha=.01):
    # re...
Tintoretto asked 4/1, 2018 at 20:12
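A vectorised sketch of that derivative using only NumPy (the alpha default mirrors the question's 0.01):
import numpy as np

def dlrelu(x, alpha=0.01):
    # derivative of leaky ReLU: 1 where x > 0, alpha elsewhere; returns a numpy array
    return np.where(x > 0, 1.0, alpha)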
3
Is there a logit function in tensorflow, i.e. the inverse of the sigmoid function? I have searched Google but have not found one.
Farthest asked 11/6, 2018 at 9:14
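If no built-in is at hand, the inverse of the sigmoid can be written directly from its definition; a sketch with plain TensorFlow ops:
import tensorflow as tf

def logit(p):
    # inverse of sigmoid: log(p / (1 - p)), valid for 0 < p < 1
    return tf.math.log(p) - tf.math.log(1.0 - p)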
1
Solved
According to the discussions on the PyTorch forum:
What’s the difference between nn.ReLU() and nn.ReLU(inplace=True)?
Guidelines for when and why one should set inplace = True?
The purpose of inplac...
Rangoon asked 10/11, 2021 at 13:4
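For illustration, the flag only changes whether the op writes its result into the input tensor's memory; a small sketch:
import torch
import torch.nn as nn

x = torch.randn(3)
y = nn.ReLU(inplace=False)(x)  # x is left untouched, a new tensor is allocated
z = nn.ReLU(inplace=True)(x)   # x itself is overwritten with the result, saving memory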
2
Solved
I have a question regarding appropriate activation functions with environments that have both positive and negative rewards.
In reinforcement learning, our output, I believe, should be the expecte...
Argentinaargentine asked 26/12, 2017 at 14:35
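A hedged sketch of one common choice for this situation: a linear (identity) output layer, so the predicted expected return is not constrained in sign or range. The layer sizes and input shape are illustrative.
from tensorflow import keras
from tensorflow.keras import layers

value_net = keras.Sequential([
    layers.Dense(64, activation='relu', input_shape=(8,)),
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='linear'),  # unbounded output can represent negative returns
])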
2
Solved
I am struggling to implement an activation function in tensorflow in Python.
The code is the following:
def myfunc(x):
    if x > 0:
        return 1
    return 0
But I am always getting the error:
Usin...
Reflexion asked 1/2, 2018 at 20:51
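A Python if cannot branch on a symbolic tensor; a sketch that expresses the same step function with tensor ops instead:
import tensorflow as tf

def myfunc(x):
    # element-wise step: 1.0 where x > 0, else 0.0, without a Python if
    return tf.cast(x > 0, tf.float32)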
5
Solved
Most examples of neural networks for classification tasks I've seen use a softmax layer as the output activation function. Normally, the other hidden units use a sigmoid, tanh, or ReLU function as ...
Kami asked 2/6, 2016 at 10:1
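For reference, a minimal Keras sketch of the pattern the question describes (ReLU hidden units, softmax only on the output layer); the sizes are illustrative:
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(128, activation='relu', input_shape=(784,)),
    layers.Dense(10, activation='softmax'),  # outputs sum to 1 and are read as class probabilities
])
model.compile(optimizer='adam', loss='categorical_crossentropy')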
3
Solved
I have been experimenting with neural networks these days. I have come across a general question regarding the activation function to use. This might be a well-known fact, but I couldn't understa...
Russ asked 11/10, 2017 at 5:28
1
Solved
For a CNN architecture I want to use a SpatialDropout2D layer instead of a Dropout layer.
Additionally I want to use BatchNormalization.
So far I had always set the BatchNormalization directly after a C...
Stamp asked 7/1, 2020 at 19:20
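One commonly used ordering (a sketch, not the only valid choice) keeps BatchNormalization right after the convolution and places SpatialDropout2D after the activation; the filter count and dropout rate are illustrative:
from tensorflow.keras import layers

def conv_block(x, filters):
    x = layers.Conv2D(filters, 3, padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation('relu')(x)
    x = layers.SpatialDropout2D(0.2)(x)  # drops whole feature maps rather than single pixels
    return x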
2
Solved
I am working on Keras in Python and I have a neural network (see code below).
Currently it works with only a ReLU activation.
For experimental reasons I would like to have some neurons on ReLU an...
Steamer asked 12/12, 2017 at 11:59
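One way to sketch this in Keras is to split the layer into two branches with different activations and concatenate them; the sizes and names are illustrative:
from tensorflow.keras import layers, Model

inputs = layers.Input(shape=(20,))
relu_part = layers.Dense(16, activation='relu')(inputs)
tanh_part = layers.Dense(16, activation='tanh')(inputs)
hidden = layers.Concatenate()([relu_part, tanh_part])  # 32 units: half ReLU, half tanh
outputs = layers.Dense(1)(hidden)
model = Model(inputs, outputs)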
2
Solved
I am using the Sequential model from Keras, with the DENSE layer type. I wrote a function that recursively calculates predictions, but the predictions are way off. I am wondering what is the best a...
Tried asked 8/11, 2019 at 6:2
2
Solved
I am creating a customized activation function, RBF activation function in particular:
from keras import backend as K
from keras.layers import Lambda
l2_norm = lambda a,b: K.sqrt(K.sum(K.pow((a-b...
Oneman asked 19/12, 2018 at 17:0
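A self-contained sketch of a Gaussian RBF built with the same backend functions as the question's code; the centre (zero) and the width gamma are illustrative constants:
from keras import backend as K
from keras.layers import Lambda

def rbf(x, gamma=1.0):
    # Gaussian RBF around zero: exp(-gamma * ||x||^2), computed per sample
    return K.exp(-gamma * K.sum(K.square(x), axis=-1, keepdims=True))

rbf_layer = Lambda(rbf)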
1
Solved
In PyTorch, a classification network model is defined like this:
class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        self.hidden = torch.nn.L...
Sorrento asked 15/8, 2019 at 20:43
1
I'm trying to write a custom activation function for use with Keras. I cannot write it with tensorflow primitives, as it does not properly compute the derivative. I followed How to make a custom activa...
Tutti asked 13/8, 2019 at 1:16
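When the automatic derivative of the primitive expression is not what you want, tf.custom_gradient lets you supply the backward pass yourself; a hedged sketch with an illustrative hard-step forward and a smooth surrogate gradient (not the question's actual function):
import tensorflow as tf

@tf.custom_gradient
def hard_step(x):
    y = tf.cast(x > 0, tf.float32)   # forward: 0/1 step, whose true gradient is zero almost everywhere

    def grad(dy):
        s = tf.sigmoid(x)
        return dy * s * (1.0 - s)    # backward: use the sigmoid derivative as a surrogate

    return y, grad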
4
The relu function as defined in keras/activations.py is:
def relu(x, alpha=0., max_value=None):
    return K.relu(x, alpha=alpha, max_value=max_value)
It has a max_value which can be used to clip the v...
Combatant asked 20/12, 2016 at 22:54
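For illustration, passing max_value clips the positive side, which is how ReLU6-style activations are built; a sketch using the TF-bundled Keras backend (the input values are illustrative):
from tensorflow.keras import backend as K
import numpy as np

x = K.constant(np.array([-3.0, 2.0, 8.0]))
y = K.relu(x, max_value=6.0)   # -> [0., 2., 6.]: negatives zeroed, positives capped at 6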
2
Solved
I'm playing with Keras a little bit and I'm thinking about what the difference is between a linear activation layer and no activation layer at all? Doesn't it have the same behavior? If so, what's th...
Zitvaa asked 3/5, 2019 at 7:21
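A quick sketch of why the two are expected to behave the same: the default activation of a Dense layer is already the identity, so both variants compute only W·x + b (the unit count is illustrative):
from tensorflow.keras import layers

dense_default = layers.Dense(32)                       # activation=None -> identity
dense_linear  = layers.Dense(32, activation='linear')  # 'linear' is also the identity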
1
Solved
I am trying to figure out how to match activation=sigmoid and activation=softmax with the correct model.compile() loss parameters. Specifically those associated with binary_crossentropy.
I have re...
Unfinished asked 30/4, 2019 at 22:22
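The usual pairings, shown as a sketch (the layer sizes and input shape are illustrative):
from tensorflow.keras import layers, Sequential

# binary (or multi-label) targets: one sigmoid unit per label + binary_crossentropy
binary_model = Sequential([layers.Dense(1, activation='sigmoid', input_shape=(16,))])
binary_model.compile(optimizer='adam', loss='binary_crossentropy')

# mutually exclusive classes: softmax over the classes + categorical_crossentropy
multi_model = Sequential([layers.Dense(10, activation='softmax', input_shape=(16,))])
multi_model.compile(optimizer='adam', loss='categorical_crossentropy')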
1
I can find a list of activation functions in math but not in code.
So I guess this would be the right place for such a list in code, if there ever should be one, starting with the translation of the...
Lipscomb asked 3/4, 2016 at 10:23
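A small starting point for such a list, written as plain NumPy (element-wise and vectorised; the selection is illustrative, not exhaustive):
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, a=0.01):
    return np.where(x > 0, x, a * x)

def softmax(x):
    e = np.exp(x - np.max(x))   # shift by the max for numerical stability
    return e / e.sum()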
1
I've written an LSTM model using Keras, using the LeakyReLU advanced activation:
# ADAM Optimizer with learning rate decay
opt = optimizers.Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-08...
Intimidate asked 31/10, 2018 at 9:35
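In Keras the advanced activations are layers of their own, so one sketch is to leave the preceding layer linear and stack LeakyReLU after it (the sizes, input shape, and alpha are illustrative):
from tensorflow.keras import layers, Sequential

model = Sequential([
    layers.LSTM(64, return_sequences=False, input_shape=(30, 8)),
    layers.Dense(32),             # no activation here...
    layers.LeakyReLU(alpha=0.1),  # ...LeakyReLU applied as a separate layer
    layers.Dense(1),
])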
1
Solved
I am following the official TensorFlow with Keras tutorial and I got stuck here: Predict house prices: regression - Create the model
Why is an activation function used for a task where a continuou...
Muir asked 20/7, 2018 at 12:21
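For comparison, a minimal regression sketch: the hidden layers use ReLU while the output layer is left linear, so it can produce any real-valued price (the sizes and input shape are illustrative):
from tensorflow.keras import layers, Sequential

model = Sequential([
    layers.Dense(64, activation='relu', input_shape=(13,)),
    layers.Dense(64, activation='relu'),
    layers.Dense(1),   # no activation: unbounded output for a continuous target
])
model.compile(optimizer='adam', loss='mse')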