binary threshold activation function in tensorflow
I have a piece of code that uses the sigmoid activation function for classification, which outputs values in [0, 1]. But I need an activation function that outputs binary values, either 0 or 1.

        x = tf.placeholder("float", [None, COLUMN])
        Wh = tf.Variable(tf.random_normal([COLUMN, UNITS_OF_HIDDEN_LAYER], mean=0.0, stddev=0.05))
        h = tf.nn.sigmoid(tf.matmul(x, Wh))

        Wo = tf.Variable(tf.random_normal([UNITS_OF_HIDDEN_LAYER, COLUMN], mean=0.0, stddev=0.05))
        y = tf.nn.sigmoid(tf.matmul(h, Wo))

        # Objective functions
        y_ = tf.placeholder("float", [None, COLUMN])
        correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
        cost = tf.reduce_sum(tf.cast(correct_prediction, "float")) / BATCH_SIZE

Can you please tell me how to replace the sigmoid function with a binary step one?

Dissimilarity answered 23/9, 2017 at 0:46 Comment(0)
y = tf.round(tf.nn.sigmoid(tf.matmul(h, Wo)))

That will give you a 0 or 1 output. Note that tf.round is not differentiable (its gradient is zero almost everywhere), so apply it when making predictions rather than inside the loss you train on.
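As a quick sanity check of the rounding behavior, here is a minimal NumPy sketch mimicking `tf.round(tf.nn.sigmoid(logits))` (NumPy is used only because the TF1-style snippet above is not self-contained; both libraries round half to even):

```python
import numpy as np

def sigmoid(z):
    # Standard logistic function, maps reals into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Logits below 0 give sigmoid < 0.5 and round to 0; logits above 0 round to 1.
logits = np.array([-2.0, -0.1, 0.3, 4.0])
binary = np.round(sigmoid(logits))
print(binary)  # [0. 0. 1. 1.]
```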

Adamok answered 23/9, 2017 at 12:47 Comment(0)

You don't need a sigmoid in this case. Try relu(sign(x)): sign maps the input to -1, 0, or 1, and relu clips the negatives to 0, so the result is 1 for positive inputs and 0 otherwise.
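A minimal NumPy sketch of the relu(sign(x)) composition (NumPy stands in for the TF ops here; `tf.nn.relu(tf.sign(x))` behaves the same element-wise):

```python
import numpy as np

def binary_step(x):
    # sign(x) is -1/0/+1; maximum(0, .) is relu, clipping negatives to 0.
    # Net effect: 1 where x > 0, else 0 (a hard binary-step activation).
    return np.maximum(0.0, np.sign(x))

print(binary_step(np.array([-3.0, 0.0, 0.5, 2.0])))  # [0. 0. 1. 1.]
```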

Glycol answered 11/10, 2017 at 17:39 Comment(2)
If you don't use a sigmoid, would you then still be able to interpret it as a probability? – Helyn
Also used in BinaryNet: github.com/itayhubara/BinaryNet.tf/blob/master/nnUtils.py – Dollar
