See Mini-batches, Momentum, RMSProp and Adam: an overview of optimization algorithms for training neural networks faster.
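As a taste of what the overview covers, here is a minimal sketch of a single Adam step, which combines Momentum (a running mean of gradients) with RMSProp-style scaling (a running mean of squared gradients). The function name `adam_step` and all hyperparameter defaults are illustrative, not taken from the linked post; NumPy is assumed.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (illustrative sketch, not the linked post's code)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: Momentum-style average
    v = beta2 * v + (1 - beta2) * grad**2     # second moment: RMSProp-style average
    m_hat = m / (1 - beta1**t)                # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2*theta.
theta = np.array([1.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```

After 2000 steps the parameter has moved close to the minimum at zero, illustrating how the bias-corrected moment estimates keep the effective step size well behaved.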