Backpropagation (machine learning)
Backpropagation in supervised machine learning is the process used to calculate the gradient of the error function with respect to each weight parameter in a neural network, such as a convolutional neural network (CNN). Essentially, the gradient estimates how each parameter should change in order to minimise the error and thereby optimise the network overall 1,2.
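As a toy illustration of the idea, the following Python/NumPy sketch trains a tiny fully connected network on the XOR problem, computing every gradient by hand with the chain rule. The layer sizes, sigmoid activation, mean squared error and learning rate are arbitrary choices for the demonstration, not specifics from this article.

```python
import numpy as np

# Minimal backpropagation sketch: 2 inputs -> 4 hidden units -> 1 output,
# trained on XOR. All hyperparameters here are illustrative assumptions.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # targets

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0  # learning rate (assumed for this demo)

for step in range(5000):
    # Forward pass: compute activations layer by layer.
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output

    # Error (loss) function: mean squared error over the batch.
    loss = np.mean((out - y) ** 2)

    # Backward pass (backpropagation): apply the chain rule from the
    # output back towards the input, yielding the gradient of the loss
    # with respect to every weight and bias.
    d_out = 2 * (out - y) / len(X)        # dL/d(out)
    d_z2 = d_out * out * (1 - out)        # through the output sigmoid
    dW2 = h.T @ d_z2                      # dL/dW2
    db2 = d_z2.sum(axis=0, keepdims=True)

    d_h = d_z2 @ W2.T                     # error propagated to hidden layer
    d_z1 = d_h * h * (1 - h)              # through the hidden sigmoid
    dW1 = X.T @ d_z1                      # dL/dW1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient step: nudge each parameter opposite its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(loss)  # loss typically falls towards zero as training proceeds
```

In practice, deep learning frameworks compute these same gradients automatically (via automatic differentiation) rather than requiring them to be derived by hand as above; the manual derivation simply makes the chain-rule structure of backpropagation explicit.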
References
- 1. Buduma N, Locascio N. Fundamentals of Deep Learning. ISBN: 9781491925614
- 2. Marsland S. Machine Learning. ISBN: 9781498759786
Related articles: Artificial intelligence
- artificial intelligence (AI)
- imaging data sets
- computer-aided diagnosis (CAD)
- natural language processing
- machine learning (overview)
- visualising and understanding neural networks
- common data preparation/preprocessing steps
- DICOM to bitmap conversion
- dimensionality reduction
- scaling
- centring
- normalisation
- principal component analysis
- training, testing and validation datasets
- augmentation
- loss function
- optimisation algorithms
- ADAM
- momentum (Nesterov)
- stochastic gradient descent
- mini-batch gradient descent
- regularisation
- linear and quadratic
- batch normalisation
- ensembling
- rule-based expert systems
- glossary
- activation function
- anomaly detection
- automation bias
- backpropagation
- batch size
- computer vision
- concept drift
- cost function
- confusion matrix
- convolution
- cross validation
- curse of dimensionality
- dice similarity coefficient
- dimensionality reduction
- epoch
- explainable artificial intelligence/XAI
- feature extraction
- federated learning
- gradient descent
- ground truth
- hyperparameters
- image dataset normalisation
- image registration
- imputation
- iteration
- jaccard index
- linear algebra
- noise reduction
- normalisation
- R (Programming language)
- radiomics quality score (RQS)
- Python (Programming language)
- segmentation
- semi-supervised learning
- synthetic and augmented data
- overfitting
- underfitting
- transfer learning