Ensemble learning
Combining a number of weak learners into one stronger model to reduce variance and prevent overfitting
- Bagging: draw bootstrap samples of the training data and train one learner per sample. The final output is the average of the learners' predictions (regression) or their majority vote (classification). See the bagging sketch after this list.
- Boosting: train models iteratively, where each newly trained model tries to fix the mistakes of the previous ones. Prone to overfitting, especially on noisy data. See the gradient-boosting sketch after this list.
- Boosting can be viewed as gradient descent in function space (Mason et al., AnyBoost): http://maths.dur.ac.uk/~dma6kp/pdf/face_recognition/Boosting/Mason99AnyboostLong.pdf
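As a rough illustration (not from the original notes), here is a minimal bagging sketch in Python. It assumes numpy arrays as inputs and uses scikit-learn's `DecisionTreeRegressor` as the weak learner; the function name `bagged_predict` and the hyperparameter defaults are made up for illustration.

```python
# Minimal bagging sketch: bootstrap-sample the data, train one tree per
# sample, then average the trees' predictions (regression case).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_predict(X_train, y_train, X_test, n_learners=25, seed=0):
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_learners):
        # Bootstrap sample: draw n rows with replacement.
        idx = rng.integers(0, len(X_train), size=len(X_train))
        tree = DecisionTreeRegressor().fit(X_train[idx], y_train[idx])
        preds.append(tree.predict(X_test))
    # Average over learners (for classification, take a majority vote instead).
    return np.mean(preds, axis=0)
```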
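And a minimal gradient-boosting sketch for the squared loss, again only an illustration: each new tree is fit to the current residuals, which are the negative gradient of the loss with respect to the ensemble's predictions. This is the "gradient descent in function space" view from the Mason et al. paper linked above. The function name `boosted_predict` and the choices of learning rate and tree depth are illustrative assumptions.

```python
# Minimal gradient-boosting sketch (squared loss): each round fits a small
# tree to the residuals, i.e. the negative gradient of 0.5 * (y - f)^2
# with respect to the current ensemble output f.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boosted_predict(X_train, y_train, X_test, n_rounds=50, lr=0.1):
    pred_train = np.zeros(len(y_train))  # current ensemble output on train
    pred_test = np.zeros(len(X_test))    # current ensemble output on test
    for _ in range(n_rounds):
        residual = y_train - pred_train  # negative gradient of the squared loss
        stump = DecisionTreeRegressor(max_depth=2).fit(X_train, residual)
        # Take a small step in function space toward the residuals.
        pred_train += lr * stump.predict(X_train)
        pred_test += lr * stump.predict(X_test)
    return pred_test
```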
Examples of ensemble learning methods:
- Random forests (bagging of decision trees, with random feature subsets per split)
- AdaBoost and gradient-boosted trees (boosting)
Reading