
Mini-batch gradient descent

Mini-batch gradient descent is a technique that combines properties of batch gradient descent and stochastic gradient descent to balance the efficiency and accuracy of the gradient descent algorithm. In each iteration, a subset of examples from the data set (a batch) is used to compute the gradient and update the parameters. The batch may be as small as 2 examples or larger than 200. Compared with batch gradient descent it is significantly faster, since each update uses only a fraction of the data; and compared with stochastic gradient descent, vectorisation over the examples in a batch allows the computation to be parallelised, so it can be faster than stochastic gradient descent as well.
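A minimal sketch of the idea in Python with NumPy, assuming a linear model trained with squared-error loss; the function name, learning rate, batch size of 32 and epoch count are illustrative choices, not part of the article.

import numpy as np

def minibatch_gradient_descent(X, y, lr=0.01, batch_size=32, epochs=100):
    """Minimise mean squared error for a linear model y ~ X @ w
    using mini-batch gradient descent (illustrative sketch)."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(epochs):
        # Shuffle once per epoch so each pass sees different batches.
        indices = np.random.permutation(n_samples)
        for start in range(0, n_samples, batch_size):
            batch = indices[start:start + batch_size]
            X_b, y_b = X[batch], y[batch]
            # Gradient of the squared error over this batch only,
            # not the full data set (the mini-batch step).
            grad = 2.0 / len(batch) * X_b.T @ (X_b @ w - y_b)
            w -= lr * grad
    return w

# Example usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)
w = minibatch_gradient_descent(X, y)  # w approximates true_w

Because the gradient for a whole batch is computed as one vectorised matrix operation, each update amortises its cost across many examples, which is the efficiency gain over updating one example at a time.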


Article information

rID: 61717
Tag: ai
