Deep Learning techniques (Deep Neural Networks, Deep Belief Networks, Deep Stacking Networks, ...) are very effective in some areas. They take a very long time to train, but this is a one-time cost.
I read several papers about different techniques, but they focused only on accuracy and training time. How fast are these networks at producing an answer in practice, once trained?
Is there any data available on benchmarking inference for deep networks with, say, millions of parameters?
I would think they are quite fast, since all the weights are fixed and a prediction is just a forward pass, but as the functions can be quite complex and the number of parameters quite high, I'm not sure how they really perform in practice.
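To make the question concrete, here is a minimal sketch of what I mean by "producing an answer once trained": a plain feedforward network in NumPy (the layer sizes are arbitrary assumptions, not any published architecture) where inference reduces to a handful of fixed matrix multiplications, and a rough timing of one forward pass.

```python
import time
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes giving roughly 4 million parameters.
sizes = [784, 2048, 1024, 512, 10]
weights = [rng.standard_normal((m, n)) * 0.01 for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    # Once trained, inference is only fixed matrix multiplies + nonlinearities.
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(x @ W + b, 0.0)      # ReLU hidden layers
    return x @ weights[-1] + biases[-1]     # linear output layer

x = rng.standard_normal((1, sizes[0]))
forward(x)                                   # warm-up run

n_runs = 100
t0 = time.perf_counter()
for _ in range(n_runs):
    forward(x)
ms = (time.perf_counter() - t0) / n_runs * 1e3
n_params = sum(W.size for W in weights)
print(f"{n_params:,} parameters, {ms:.3f} ms per forward pass")
```

Of course this toy timing says little about convolutional or recurrent architectures, batching, or GPU execution, which is exactly why I'm asking whether systematic benchmarks exist.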