I am working on a network in TensorFlow that produces a vector, which is then passed through a softmax to give my output.
While testing this, I noticed that, strangely, the softmax output has zeros in every coordinate but one.
Based on softmax's definition with the exponential, I assumed this wasn't supposed to happen. Is this an error?
EDIT: My vector has 120x160 = 19200 elements. All values are float32.
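For reference, here is a minimal NumPy reproduction of the behavior I am seeing (the example logits are made up, not my actual network's output; I am assuming my logits have a similarly large spread). In float32, `exp(x - max(x))` underflows to exactly 0 once an entry is roughly 90 or more below the maximum, so the softmax comes out one-hot:

```python
import numpy as np

def softmax(x):
    # Standard max-subtraction for numerical stability
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical logits with a large spread, in float32 like my vector
logits = np.array([0.0, 50.0, 200.0], dtype=np.float32)

# exp(-200) and exp(-150) underflow to 0 in float32,
# so all probability mass lands on one coordinate
print(softmax(logits))  # -> [0. 0. 1.]
```

Is this underflow expected, or does it indicate a bug in my network?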