What is the difference between a layer with a linear activation and a layer without activation?
I'm playing with Keras a bit, and I'm wondering: what is the difference between a layer with a linear activation and a layer with no activation at all? Don't they behave the same? If so, what's the point of the linear activation?

I mean the difference between these two code snippets:

 model.add(Dense(1500))
 model.add(Activation('linear'))
 model.add(Dense(1500))

and

 model.add(Dense(1500))
 model.add(Dense(1500))
Zitvaa answered 3/5, 2019 at 7:21
If you don't specify an activation in a Dense layer, the activation is linear. This is from the Keras documentation:

activation: Activation function to use (see activations). If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x)

You only need to add an Activation layer when you want something other than 'linear', e.g.:

model.add(Dense(1500))
model.add(Activation('relu'))
model.add(Dense(1500))
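
A minimal numpy sketch (not Keras itself) can illustrate the quoted definition: the 'linear' activation a(x) = x leaves values untouched, while a non-linear activation such as ReLU actually transforms them:

```python
import numpy as np

def linear(x):
    # Keras' "linear" activation: a(x) = x, i.e. the identity
    return x

def relu(x):
    # ReLU: clamp negative values to zero
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(linear(x))  # [-2.  -0.5  0.   1.5] -- unchanged
print(relu(x))    # [ 0.   0.   0.   1.5]
```

This is why Activation('linear') after a Dense layer is a no-op: it applies the identity function to the layer's output.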
Coefficient answered 3/5, 2019 at 7:35

You are right, there is no difference between your snippets: Both use linear activation.

The activation function determines whether the layer is non-linear (e.g. sigmoid is a non-linear activation function):

model.add(Dense(1500))
model.add(Dense(1500, activation='sigmoid'))

7 Common Nonlinear Activation Functions and How to Choose an Activation Function
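
A hedged numpy sketch of why the non-linearity matters: two stacked linear layers collapse into a single linear map (so the second layer adds no expressive power), whereas putting a sigmoid in between breaks that collapse. The weight matrices W1, W2 and their shapes are made up for illustration; biases are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # first Dense layer's weights (illustrative)
W2 = rng.normal(size=(3, 2))   # second Dense layer's weights (illustrative)
x = rng.normal(size=(1, 4))    # one input sample

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two stacked linear layers ...
two_linear = (x @ W1) @ W2
# ... equal ONE linear layer whose weights are W1 @ W2:
one_linear = x @ (W1 @ W2)
print(np.allclose(two_linear, one_linear))  # True

# With a sigmoid in between, the network is no longer a single linear map:
nonlinear = sigmoid(x @ W1) @ W2
print(np.allclose(nonlinear, one_linear))   # almost surely False
```

This is the usual argument for non-linear activations: without them, any depth of Dense layers is equivalent to one Dense layer.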

Bacteroid answered 3/5, 2019 at 7:35