Hyperparameter (machine learning)


Hyperparameters are settings of a machine learning algorithm that are chosen before the algorithm is trained on data. Hyperparameters are model-specific; for example, they typically include the number of training epochs for a deep learning model or the number of branches in a decision tree model. The optimization of such parameters is a critical part of creating effective algorithms. This optimization process is often referred to as hyperparameter tuning, and there has been significant work on automating it. The most widely used tuning techniques are manual configuration, automated random search, and grid search 1. NB: in machine learning, the term parameter is often reserved for variables that change during training (e.g. model weights), in contrast to hyperparameters, which are fixed before training begins.
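
For illustration, a minimal sketch of grid search and random search using scikit-learn's GridSearchCV and RandomizedSearchCV, assuming a decision tree classifier on a toy dataset; the hyperparameter ranges below are arbitrary examples, not recommendations.

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# Hyperparameters are fixed before training; the search tries several
# candidate settings and keeps the best by cross-validation score.
param_grid = {"max_depth": [2, 4, 8, None], "min_samples_leaf": [1, 5, 10]}

# Grid search evaluates every combination in the grid.
grid = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
grid.fit(X, y)
print("grid search best:", grid.best_params_, grid.best_score_)

# Random search samples a fixed number of candidates from the same ranges.
rand = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_distributions=param_grid,
    n_iter=6,
    cv=5,
    random_state=0,
)
rand.fit(X, y)
print("random search best:", rand.best_params_, rand.best_score_)

Grid search is exhaustive but grows combinatorially with the number of hyperparameters, whereas random search caps the number of trials (n_iter above), which is why it is often preferred when the search space is large.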

 
