The output of a softmax isn't supposed to have zeros, right?

I am working on a net in TensorFlow that produces a vector, which is then passed through a softmax to give my output.

Now I have been testing this, and weirdly enough the vector (the one that passed through the softmax) has zeros in every coordinate but one.

Based on the definition of softmax, where every term is a (strictly positive) exponential, I assumed this wasn't supposed to happen. Is this an error?

EDIT: My vector is 120x160 = 19200 entries. All values are float32.
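
For illustration, here is a minimal sketch of what I am seeing (plain NumPy rather than my actual net; the shape and the logit values here are made up):

```python
import numpy as np

# Hypothetical float32 logits with one entry far above the rest;
# the 19200-element shape just mirrors the 120x160 vector above.
logits = np.full(19200, -60.0, dtype=np.float32)
logits[0] = 60.0

# Softmax exactly as defined: exp(x_i) / sum_j exp(x_j)
e = np.exp(logits)
probs = e / e.sum()

print(np.count_nonzero(probs))  # 1 -- every other coordinate is exactly 0.0
print(probs[0])                 # ~1.0
```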

Carpophagous asked 23/8, 2016 at 19:56

It may not be an error. You need to look at the input to the softmax as well. It is quite possible that this vector has very negative values and a single very positive one. That would produce a softmax output vector that is all zeros except for a single 1.0.

You correctly pointed out that the softmax numerator should never be zero, since every term is an exponential. However, with limited floating-point precision, the numerator can be so small, say exp(-50000), that it underflows and evaluates to exactly zero.
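
To see this concretely, here is a NumPy sketch (an illustration, not TensorFlow's actual code; the max-shift is the standard numerically stable way to compute softmax):

```python
import numpy as np

# exp underflows to exactly zero long before the argument reaches -50000:
print(np.exp(np.float32(-50000.0)))  # 0.0
print(np.finfo(np.float32).tiny)     # ~1.18e-38, smallest normal float32

def stable_softmax(x):
    # Shift by the max so the largest term is exp(0) = 1. This prevents
    # overflow in the numerator, but entries far enough below the max
    # still underflow to exactly 0 -- which is the correct float32 rounding.
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

print(stable_softmax(np.array([60.0, -60.0, -60.0], dtype=np.float32)))
# [1. 0. 0.]
```

So the zeros are simply the closest float32 values to the true (tiny but nonzero) probabilities, not a bug in the computation.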

Brunk answered 23/8, 2016 at 20:01
You were kinda right. The values weren't too negative or too big, but I added an LRN layer, which reduced the values and normalised the whole thing. (Although it roughly doubled my computation time, and it might eventually be worse when I try to train the net on the GPU, as LRN doesn't have a GPU implementation.) – Carpophagous
How does one solve this problem, then? – Horme
