# Week 6 - Optimization Algorithms

### This week you will...

Learn about mini-batch gradient descent, Momentum, RMSProp, and Adam: an overview of optimization algorithms for training neural networks faster.
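The update rules behind these optimizers can be sketched in plain Python. This is a minimal illustration on scalar parameters, not the course's reference implementation; the hyperparameter values are common defaults chosen here for illustration.

```python
import math

def momentum_step(w, grad, v, lr=0.01, beta=0.9):
    # Momentum: keep an exponentially weighted average of past gradients
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def rmsprop_step(w, grad, s, lr=0.01, beta=0.999, eps=1e-8):
    # RMSProp: scale the step by a running average of squared gradients
    s = beta * s + (1 - beta) * grad ** 2
    return w - lr * grad / (math.sqrt(s) + eps), s

def adam_step(w, grad, v, s, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam combines Momentum (first moment) and RMSProp (second moment),
    # with bias correction so early steps are not underestimated
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    v_hat = v / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    return w - lr * v_hat / (math.sqrt(s_hat) + eps), v, s

# Toy example: minimize f(w) = w^2 (gradient 2w) with Adam
w, v, s = 5.0, 0.0, 0.0
for t in range(1, 501):
    grad = 2 * w
    w, v, s = adam_step(w, grad, v, s, t, lr=0.1)
print(round(w, 3))
```

In a real training loop, each `grad` would be computed on one mini-batch rather than the full dataset, which is what makes these per-step updates cheap.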

### Learning Resources

### Slides

{% embed url="https://deeplearning.freelab.org/dl1/weeks/week7.html" %}
