Batch size (machine learning)

Last revised by Andrew Murphy on 2 May 2019

Batch size is a term used in machine learning that refers to the number of training examples utilised in one iteration, i.e. one gradient update. The batch size can be one of three options (a short illustrative sketch follows the list):

  1. batch mode: where the batch size is equal to the total dataset size, making the iteration and epoch values equivalent
  2. mini-batch mode: where the batch size is greater than one but less than the total dataset size, usually a number that divides evenly into the total dataset size
  3. stochastic mode: where the batch size is equal to one, so the gradient and the neural network parameters are updated after each sample
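
The sketch below shows how the three modes differ only in how many samples are used per parameter update. It is a minimal illustration, assuming a toy linear model fitted with NumPy gradient descent; the data, the run_epoch helper and all hyperparameters are hypothetical rather than taken from any particular library.

```python
import numpy as np

# Toy data: 60 samples of a noisy linear relationship y = 3x + 2 (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = 3 * X[:, 0] + 2 + rng.normal(scale=0.1, size=60)

def run_epoch(X, y, w, b, batch_size, lr=0.1):
    """One epoch of gradient descent on a linear model y_hat = X @ w + b.

    batch_size == len(X)      -> batch mode (one update per epoch)
    1 < batch_size < len(X)   -> mini-batch mode
    batch_size == 1           -> stochastic mode (one update per sample)
    """
    idx = rng.permutation(len(X))              # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        err = Xb @ w + b - yb                  # prediction error on this batch
        # Gradients of the mean squared error with respect to w and b.
        grad_w = 2 * Xb.T @ err / len(batch)
        grad_b = 2 * err.mean()
        w = w - lr * grad_w                    # parameters updated once per batch
        b = b - lr * grad_b
    return w, b

for mode, bs in [("batch", len(X)), ("mini-batch", 10), ("stochastic", 1)]:
    w, b = np.zeros(1), 0.0
    for _ in range(50):
        w, b = run_epoch(X, y, w, b, batch_size=bs)
    print(f"{mode:11s} batch_size={bs:2d} -> w={w[0]:.2f}, b={b:.2f}")
```

Per epoch, stochastic mode therefore performs the most parameter updates (one per sample) and batch mode the fewest (exactly one), with mini-batch mode in between.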
