Reflection padding Conv2D
I'm using Keras to build a convolutional neural network for image segmentation, and I want to use "reflection padding" instead of padding='same', but I cannot find a way to do it in Keras.

from keras.layers import Input, Conv2D

inputs = Input((num_channels, img_rows, img_cols))
conv1 = Conv2D(32, 3, padding='same', kernel_initializer='he_uniform', data_format='channels_first')(inputs)

Is there a way to implement a reflection padding layer and insert it into a Keras model?

Remmer answered 4/6, 2018 at 9:37 Comment(0)

Found the solution! We only need to create a new layer class that wraps TensorFlow's predefined padding function.

import tensorflow as tf
from keras.engine.topology import Layer
from keras.engine import InputSpec

class ReflectionPadding2D(Layer):
    def __init__(self, padding=(1, 1), **kwargs):
        self.padding = tuple(padding)
        self.input_spec = [InputSpec(ndim=4)]
        super(ReflectionPadding2D, self).__init__(**kwargs)

    def get_output_shape_for(self, s):
        """ If you are using "channels_last" configuration"""
        return (s[0], s[1] + 2 * self.padding[0], s[2] + 2 * self.padding[1], s[3])

    def call(self, x, mask=None):
        w_pad, h_pad = self.padding
        return tf.pad(x, [[0, 0], [h_pad, h_pad], [w_pad, w_pad], [0, 0]], 'REFLECT')

# A little demo
from keras.layers import Input, Conv2D

inputs = Input((img_rows, img_cols, num_channels))
padded_inputs = ReflectionPadding2D(padding=(1, 1))(inputs)
conv1 = Conv2D(32, 3, padding='valid', kernel_initializer='he_uniform',
               data_format='channels_last')(padded_inputs)
Remmer answered 4/6, 2018 at 11:27 Comment(3)
But shouldn't the padding come before the convolution in the demo? – Filmore
That did not work for me; I still had the same shape afterwards. – Week
@ChristofHenkel Check out my answer below. You should change the method name get_output_shape_for to compute_output_shape. – Labuan

The accepted answer above does not work in current Keras versions. Here is a version that works:

import tensorflow as tf
from keras.layers import Layer, InputSpec

class ReflectionPadding2D(Layer):
    def __init__(self, padding=(1, 1), **kwargs):
        self.padding = tuple(padding)
        self.input_spec = [InputSpec(ndim=4)]
        super(ReflectionPadding2D, self).__init__(**kwargs)

    def compute_output_shape(self, s):
        """ If you are using "channels_last" configuration"""
        return (s[0], s[1] + 2 * self.padding[0], s[2] + 2 * self.padding[1], s[3])

    def call(self, x, mask=None):
        w_pad, h_pad = self.padding
        return tf.pad(x, [[0, 0], [h_pad, h_pad], [w_pad, w_pad], [0, 0]], 'REFLECT')
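
A minimal usage sketch (the image size and filter count below are only illustrative, not from the question): with compute_output_shape in place, Keras reports the padded spatial size as expected.

from keras.layers import Input, Conv2D
from keras.models import Model

inputs = Input((64, 64, 3))                           # illustrative image size
padded = ReflectionPadding2D(padding=(1, 1))(inputs)  # -> (None, 66, 66, 3)
conv = Conv2D(32, 3, padding='valid')(padded)         # -> (None, 64, 64, 32)
Model(inputs, conv).summary()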
Labuan answered 17/11, 2018 at 9:41 Comment(5)
jeevaa_v, can you explain a bit about what you wrote? Even a reference article explaining what reflection layers do would be great. – Wo
I have just created a custom layer in Keras using TensorFlow's reflection padding. Regarding reflection padding and other kinds of padding, see here: www-cs.engr.ccny.cuny.edu/~wolberg/cs470/hw/hw2_pad.txt (see also the small illustration after these comments). – Labuan
This implementation is helpful and clean, thanks for sharing! – Marinna
There is a big problem here if you use padding larger than (1, 1) and save your model: when you load it, it will try to do (1, 1) padding because that is how it is initialized in __init__, and the model will fail to load. I can't figure out a way around it, so I'd use the Lambda solution below if you have this problem. – Ailee
machinecurve.com/index.php/2020/02/10/… – Jacobo
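
To illustrate what 'REFLECT' padding does compared with other modes (the 1-D row below is just an example), tf.pad mirrors the border values without repeating the edge pixel itself:

import tensorflow as tf

x = tf.constant([[1, 2, 3, 4]])            # one row of "pixels"
tf.pad(x, [[0, 0], [1, 1]], 'REFLECT')     # -> [[2, 1, 2, 3, 4, 3]]
tf.pad(x, [[0, 0], [1, 1]], 'SYMMETRIC')   # -> [[1, 1, 2, 3, 4, 4]] (edge value repeated)
tf.pad(x, [[0, 0], [1, 1]], 'CONSTANT')    # -> [[0, 1, 2, 3, 4, 0]] (zero padding, like 'same')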
The accepted solution above did not work for me with the newer Keras version, so I came up with my own. The snippet below pads a batch of 202x202x3 images to 256x256x3:

import tensorflow as tf
from keras.layers import Input, Lambda

inp = Input((202, 202, 3))  # a batch of 202x202 RGB images
inp_padded = Lambda(lambda x: tf.pad(x, [[0, 0], [27, 27], [27, 27], [0, 0]], 'REFLECT'))(inp)  # 202 + 2 * 27 = 256

Week answered 24/8, 2018 at 17:51 Comment(1)
Really simple and effective solution. – Pigfish

As you can check in the documentation, there is no such 'reflect' padding; only 'same' and 'valid' are implemented in Keras.

You could try to implement it yourself, or check whether somebody already has. A good starting point is the Conv2D class: look at where the self.padding member variable is used. A hedged sketch of that idea follows below.
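
For illustration, a minimal sketch of that idea (this is an assumption about how one might do it, not an existing Keras layer; it assumes channels_last data, stride 1, and odd kernel sizes): subclass Conv2D, force padding='valid', and reflect-pad the input in call before delegating to the parent convolution.

import tensorflow as tf
from keras.layers import Conv2D

class ReflectConv2D(Conv2D):
    """Hypothetical Conv2D variant that reflect-pads instead of zero-padding."""

    def __init__(self, filters, kernel_size, **kwargs):
        kwargs['padding'] = 'valid'  # the reflection pad below takes over the role of 'same'
        super(ReflectConv2D, self).__init__(filters, kernel_size, **kwargs)

    def call(self, inputs):
        h_pad = self.kernel_size[0] // 2
        w_pad = self.kernel_size[1] // 2
        padded = tf.pad(inputs, [[0, 0], [h_pad, h_pad], [w_pad, w_pad], [0, 0]], 'REFLECT')
        return super(ReflectConv2D, self).call(padded)

    def compute_output_shape(self, input_shape):
        # The pad exactly compensates the 'valid' convolution, so spatial dims are preserved.
        return (input_shape[0], input_shape[1], input_shape[2], self.filters)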

Flyblown answered 4/6, 2018 at 9:44 Comment(2)
I know that this is not yet implemented in Keras. But how do I implement a reflection layer and add it to a Keras model? – Remmer
Then you should clarify your question to say that you want to implement padding by reflection. – Flyblown

The accepted answer does not work if there are undefined (None) dimensions: an error is raised when compute_output_shape is called. Here is a simple workaround:

import tensorflow as tf
from keras.layers import Layer, InputSpec

class ReflectionPadding2D(Layer):
    def __init__(self, padding=(1, 1), **kwargs):
        self.padding = tuple(padding)
        self.input_spec = [InputSpec(ndim=4)]
        super(ReflectionPadding2D, self).__init__(**kwargs)

    def compute_output_shape(self, s):
        if s[1] is None:
            # Spatial dimensions are unknown, so the padded size is unknown too.
            return (None, None, None, s[3])
        return (s[0], s[1] + 2 * self.padding[0], s[2] + 2 * self.padding[1], s[3])

    def call(self, x, mask=None):
        w_pad, h_pad = self.padding
        return tf.pad(x, [[0, 0], [h_pad, h_pad], [w_pad, w_pad], [0, 0]], 'REFLECT')

    def get_config(self):
        config = super(ReflectionPadding2D, self).get_config()
        # Store the padding so the layer is restored with the same value when the model is loaded.
        config['padding'] = self.padding
        return config
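
A hedged usage sketch (the model architecture and file name are illustrative, not from the answer): with get_config and custom_objects, a model containing the layer can be saved and loaded back, including the undefined-dimension case.

from keras.layers import Input, Conv2D
from keras.models import Model, load_model

inputs = Input((None, None, 3))                    # undefined spatial dimensions
x = ReflectionPadding2D(padding=(1, 1))(inputs)
outputs = Conv2D(8, 3, padding='valid')(x)
model = Model(inputs, outputs)

model.save('reflect_demo.h5')                      # illustrative file name
restored = load_model('reflect_demo.h5',
                      custom_objects={'ReflectionPadding2D': ReflectionPadding2D})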
Veradia answered 7/2, 2020 at 15:18 Comment(0)
