Batch size (machine learning)
Batch size is a term used in machine learning that refers to the number of training examples utilised in one iteration. The batch size can be set in one of three modes:
 batch mode: the batch size equals the total dataset size, making the iteration and epoch values equivalent
 minibatch mode: the batch size is greater than one but less than the total dataset size; usually a number that divides the total dataset size evenly
 stochastic mode: the batch size is equal to one, so the gradient and the neural network parameters are updated after each sample
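The three modes above differ only in how many examples are consumed per parameter update. A minimal sketch in plain Python (the `minibatches` helper and the toy dataset are illustrative, not from any particular library) shows how the choice of batch size determines the number of iterations per epoch:

```python
def minibatches(data, batch_size):
    """Yield successive batches of up to `batch_size` examples from `data`."""
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]

dataset = list(range(10))  # toy dataset of 10 "examples"

# batch mode: batch size = dataset size -> 1 iteration per epoch
print(len(list(minibatches(dataset, len(dataset)))))  # 1

# minibatch mode: batch size 5 divides 10 evenly -> 2 iterations per epoch
print(len(list(minibatches(dataset, 5))))  # 2

# stochastic mode: batch size 1 -> one parameter update per sample
print(len(list(minibatches(dataset, 1))))  # 10
```

In a real training loop, each yielded batch would be used to compute one gradient estimate and one parameter update, so iterations per epoch equals the dataset size divided by the batch size (rounded up when the batch size does not divide it evenly).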
Related articles
 machine learning
 artificial intelligence
 computer aided diagnosis (CAD)