
Why mini-batch gradient descent for deep learning applications

Instead of running gradient descent on the entire training set at once, we can split the training set into smaller batches and take a gradient step on each batch in turn. This is called mini-batch gradient descent. Because the parameters are updated after every batch rather than once per full pass over the data, the algorithm starts making progress much sooner and trains faster, which matters especially for deep learning, where training sets are large.
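To make this concrete, here is a minimal NumPy sketch of mini-batch gradient descent applied to a simple linear regression with squared-error loss. The function name, hyperparameter values, and toy data are illustrative assumptions, not something from the original post:

import numpy as np

def minibatch_gradient_descent(X, y, lr=0.05, batch_size=64, epochs=20):
    """Fit y ~ X @ w + b with mini-batch gradient descent on MSE loss."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for epoch in range(epochs):
        # Shuffle so each epoch visits the batches in a different order
        perm = np.random.permutation(n_samples)
        X_shuf, y_shuf = X[perm], y[perm]
        for start in range(0, n_samples, batch_size):
            X_b = X_shuf[start:start + batch_size]
            y_b = y_shuf[start:start + batch_size]
            # Gradient of the mean squared error on this batch only
            err = X_b @ w + b - y_b
            grad_w = 2 * X_b.T @ err / len(y_b)
            grad_b = 2 * err.mean()
            # Parameters are updated once per batch, not once per epoch
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Toy data: y = 3*x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 1))
y = 3 * X[:, 0] + 1 + 0.1 * rng.normal(size=1000)
w, b = minibatch_gradient_descent(X, y)
print(w, b)  # should approach [3.0] and 1.0

With 1000 samples and a batch size of 64, each epoch performs 16 parameter updates instead of the single update per epoch that full-batch gradient descent would make, which is exactly where the speedup comes from.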