I'm following the Udacity MNIST tutorial, where the MNIST data is originally a 28*28 matrix per image. However, right before feeding that data to the model, they flatten each image into a 1-D array with 784 columns (784 = 28 * 28).
For example, the original training set shape was (200000, 28, 28): 200000 rows (samples), where each sample is a 28*28 matrix. They converted this into a training set whose shape is (200000, 784).
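In NumPy terms, the conversion I'm describing looks something like this (a minimal sketch; the array here is a dummy stand-in for the tutorial's actual dataset, using a smaller row count):

```python
import numpy as np

# Dummy stand-in for the training set: 1000 samples, each a 28x28 matrix.
# (The tutorial's real set has 200000 samples.)
train = np.zeros((1000, 28, 28), dtype=np.float32)

# Flatten each 28x28 image into a single row of 784 values.
flat = train.reshape(-1, 28 * 28)

print(flat.shape)  # (1000, 784)
```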
Can someone explain why they flatten the data before feeding it to TensorFlow?