Implementing skip connections in keras
I am implementing ApesNet in Keras. It has an ApesBlock that contains skip connections: two parallel branches that merge at the end by element-wise addition. How do I add this block to a sequential model in Keras?

Gibbie answered 22/2, 2017 at 6:54 Comment(0)
The easy answer is: don't use a sequential model for this; use the functional API instead. Implementing skip connections (also called residual connections) is then very easy, as shown in this example from the functional API guide:

from keras.layers import merge, Convolution2D, Input

# input tensor for a 3-channel 256x256 image
x = Input(shape=(3, 256, 256))
# 3x3 conv with 3 output channels (same as input channels)
y = Convolution2D(3, 3, 3, border_mode='same')(x)
# this returns x + y.
z = merge([x, y], mode='sum')
Selfregard answered 22/2, 2017 at 12:20 Comment(6)
So, wouldn't it be an issue during backprop, because y has the convolution's weights and z has the new tensor?Gibbie
@Siddhartharao No, as this is all symbolic the gradients can be computed directly by TF/Theano.Selfregard
+1 Why are they called residual connections? And what is the idea behind them? Can anyone help improve my understanding?Interview
@Interview That question is better suited for stats.stackexchange.comSelfregard
@Interview It's not a hint; what you are asking is off-topic for Stack Overflow.Selfregard
@Interview meta.#291509Selfregard
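(A short note for readers asking the same question as the comments above: in a residual block the layers learn the residual F(x) = H(x) - x of the desired mapping H, and the shortcut adds x back to give H(x) = F(x) + x. A minimal NumPy sketch of that idea, with made-up numbers:)

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))

# Suppose the desired mapping H is close to the identity
# (a common situation in very deep networks).
H_x = x + 0.01 * rng.normal(size=x.shape)

# The block only needs to learn the residual F(x) = H(x) - x,
# which is small and thus easier to fit than H itself ...
F_x = H_x - x

# ... and the shortcut restores H(x) = F(x) + x exactly.
assert np.allclose(F_x + x, H_x)
assert np.abs(F_x).max() < np.abs(x).max()
```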
I, too, couldn't find merge in the Keras documentation, as used in Dr. Snoopy's answer, and I got a TypeError: 'module' object is not callable.

Instead I added an Add layer.

So the same example as Dr. Snoopy's answer would be:

from keras.layers import Add, Conv2D, Input

# input tensor for a 256x256 RGB image (channels-last, the Keras 2 default)
x = Input(shape=(256, 256, 3))
# 3x3 conv with 3 output channels (same as input channels)
y = Conv2D(3, (3, 3), padding='same')(x)
# this returns x + y.
z = Add()([x, y])
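The Add layer simply performs element-wise addition over tensors of identical shape; expressed outside Keras, in plain NumPy terms:

```python
import numpy as np

x = np.arange(6, dtype=np.float32).reshape(2, 3)
y = np.ones_like(x)

# what Add()([x, y]) computes: an element-wise sum
z = x + y

assert z.shape == x.shape          # shapes must match for the skip connection
assert np.array_equal(z, x + 1.0)  # each element increased by 1
```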
Intisar answered 3/6, 2020 at 17:14 Comment(1)
Thank you. So this would do the residual block trick?Gosselin
There is a simple way to use skip connections. This is an example from something I have been working on:

from keras.layers import Input, Conv2D, UpSampling2D, concatenate
from keras.models import Model

def define_skip_model():
  
  input_net = Input((32,32,3))
  
  ## Encoder starts
  conv1 = Conv2D(32, 3, strides=(2,2), activation = 'relu', padding = 'same')(input_net)
  conv2 = Conv2D(64, 3, strides=(2,2), activation = 'relu', padding = 'same')(conv1)
  conv3 = Conv2D(128, 3, strides=(2,2), activation = 'relu', padding = 'same')(conv2)
  
  conv4 = Conv2D(128, 3, strides=(2,2), activation = 'relu', padding = 'same')(conv3)
  
  ## And now the decoder
  up1 = Conv2D(128, 3, activation = 'relu', padding = 'same')(UpSampling2D(size = (2,2))(conv4))
  merge1 = concatenate([conv3,up1], axis = 3)
  up2 = Conv2D(64, 3, activation = 'relu', padding = 'same')(UpSampling2D(size = (2,2))(merge1))
  merge2 = concatenate([conv2,up2], axis = 3)
  up3 = Conv2D(32, 3, activation = 'relu', padding = 'same')(UpSampling2D(size = (2,2))(merge2))
  merge3 = concatenate([conv1,up3], axis = 3)
  
  up4 = Conv2D(32, 3, padding = 'same')(UpSampling2D(size = (2,2))(merge3))
  
  output_net = Conv2D(3, 3, padding = 'same')(up4)
  
  model = Model(inputs = input_net, outputs = output_net)
  
  return model
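Note that these concatenate skips differ from the additive skips in the answers above: instead of summing, they stack feature maps along the channel axis, so the channel counts add up while spatial dimensions must already match. With hypothetical activation shapes consistent with the encoder/decoder above (batch, height, width, channels):

```python
import numpy as np

# hypothetical activations: conv3 from the encoder, up1 from the decoder
conv3 = np.zeros((1, 4, 4, 128), dtype=np.float32)
up1 = np.zeros((1, 4, 4, 128), dtype=np.float32)

# a concatenate skip stacks channels: 128 + 128 = 256
merge1 = np.concatenate([conv3, up1], axis=3)
assert merge1.shape == (1, 4, 4, 256)
```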
Kiley answered 19/3, 2022 at 16:52 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.