Citation, DOI, disclosures and article data
At the time the article was created Edward Chmiel had no recorded disclosures.
At the time the article was last revised Andrew Murphy had no recorded disclosures.
Boosting is an ensemble technique that builds an increasingly complex algorithm from relatively simple decision rules, typically for binary classification tasks. This is achieved by sequentially training new models (or 'weak' learners), each focusing on examples that were classified incorrectly by the previous learners. The weak learners are then combined into a single strong learner by taking a weighted vote of their decisions.
This method is strongest when there is minimal correlation between the component weak learners – that is, when the errors of the weak learners occur in different circumstances. This is encouraged by sequentially training new learners with an increased penalty for misclassifying the cases that were incorrectly classified by previous learners.
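The re-weighting idea described above can be sketched in code. The following is a minimal, illustrative implementation of one classic boosting scheme (AdaBoost) on a hypothetical one-dimensional dataset; the weak learners are decision stumps, and all names and data here are assumptions for demonstration, not from the article.

```python
# Minimal AdaBoost sketch: decision stumps on a toy 1-D dataset with labels +1/-1.
# The dataset is deliberately not separable by any single stump.
import math

X = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1, 1, 1, -1, -1, -1, 1, 1]

def stump_predict(threshold, polarity, x):
    # A stump votes `polarity` below the threshold and `-polarity` above it.
    return polarity if x < threshold else -polarity

def best_stump(X, y, w):
    # Choose the threshold/polarity pair with the lowest *weighted* error,
    # so examples that earlier learners got wrong matter more.
    best = None
    for t in range(0, 10):
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(t + 0.5, pol, xi) != yi)
            if best is None or err < best[0]:
                best = (err, t + 0.5, pol)
    return best

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n                        # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        err, t, pol = best_stump(X, y, w)
        err = max(err, 1e-10)                # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # this learner's vote weight
        ensemble.append((alpha, t, pol))
        # Increase the weight of examples this stump misclassified,
        # decrease the weight of those it got right, then renormalise.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, pol, xi))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    # The strong learner is a weighted vote of the weak learners.
    score = sum(alpha * stump_predict(t, pol, x) for alpha, t, pol in ensemble)
    return 1 if score >= 0 else -1

model = adaboost(X, y, rounds=3)
print([predict(model, xi) for xi in X])  # matches y after 3 rounds
```

Each individual stump misclassifies at least two of the eight examples, yet the weighted vote of three stumps classifies the whole toy set correctly, because the re-weighting step forces later stumps to concentrate on earlier mistakes.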
Boosting in radiology
Suppose there are three algorithms designed to detect consolidation on chest x-ray: A, B and C. Let algorithm A be accurate except when the radiograph is over-exposed, algorithm B be accurate except when the patient is rotated, and algorithm C misclassify all atelectasis as consolidation. A simple model that takes a majority vote of these component models would be more accurate than algorithms A, B and C in isolation. For example, if the film is over-exposed and algorithm A misclassifies the chest x-ray, algorithm A will be outvoted by algorithms B and C, which will both vote for the correct diagnosis.
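The majority-vote scenario above can be sketched as follows. The task is treated as binary (consolidation present: True/False); the three algorithms and the film "flags" are hypothetical stand-ins chosen to mimic the failure modes described, not an actual radiology model.

```python
# Illustrative majority vote of three imperfect classifiers, mirroring the
# consolidation example. Each algorithm is correct except in one failure mode.

def algo_a(truth, flags):
    # Accurate except on over-exposed films, where it flips its answer.
    return (not truth) if "overexposed" in flags else truth

def algo_b(truth, flags):
    # Accurate except when the patient is rotated.
    return (not truth) if "rotated" in flags else truth

def algo_c(truth, flags):
    # Calls consolidation on every film showing atelectasis.
    return True if "atelectasis" in flags else truth

def ensemble_vote(truth, flags):
    votes = [algo_a(truth, flags), algo_b(truth, flags), algo_c(truth, flags)]
    return sum(votes) > 1  # majority of the three votes

# Over-exposed film with consolidation: A errs but is outvoted by B and C.
print(ensemble_vote(True, {"overexposed"}))    # True
# Atelectasis without consolidation: C errs but is outvoted by A and B.
print(ensemble_vote(False, {"atelectasis"}))   # False
```

Note that the vote only helps while the errors are uncorrelated: for a film that is both over-exposed and rotated, A and B fail together and the majority vote is wrong, which is exactly why boosting tries to make the component learners err in different circumstances.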