IE506 Bagging, Boosting (April 5)

Summary: This document covers the machine learning ensemble methods bagging and boosting. Bagging builds multiple models on random subsets of the training data and combines their predictions by averaging or majority vote; boosting methods such as AdaBoost iteratively train weak learners, reweighting the data so that later learners focus on previously misclassified examples.

Machine Learning: Principles and Techniques

Bagging, Boosting
IE 506
P. Balamurugan

April 5, 2024


Outline

1 Classification Algorithms
Bagging
Boosting



Classification Algorithms: Bagging


Bagging

[Slides 4-6: bagging illustrated through figures; the images are not recoverable from this extraction.]
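Since the bagging slides themselves are figure-based, the following is a minimal Python sketch of the procedure described in the summary: train each base model on a bootstrap resample drawn from the training set, then combine the models' predictions by majority vote. The helper names bagging_fit and bagging_predict, and the choice of decision trees as base learners, are illustrative assumptions rather than anything specified in the slides.

    # Minimal bagging sketch (illustrative; helper names are assumptions).
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def bagging_fit(X, y, n_models=25, seed=0):
        """Train n_models trees, each on a bootstrap resample of (X, y)."""
        rng = np.random.default_rng(seed)
        N = len(X)
        models = []
        for _ in range(n_models):
            idx = rng.integers(0, N, size=N)   # N draws with replacement
            models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
        return models

    def bagging_predict(models, X):
        """Majority vote; with labels in {+1, -1} and odd n_models there are no ties."""
        votes = np.sum([m.predict(X) for m in models], axis=0)
        return np.sign(votes)

Each tree sees a different resample of the data, so the individual models disagree on hard points; the vote averages away much of this variance, which is the point of bagging.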


Classification Algorithms: Boosting


Boosting: AdaBoost

[Slide 8: AdaBoost introduced through figures; the images are not recoverable from this extraction.]


AdaBoost - a loss perspective†

Input: $N$ samples $\{(x^i, y^i)\}_{i=1}^{N}$, with $x^i \in \mathbb{R}^d$ and $y^i \in \{+1, -1\}$, $\forall i \in \{1, 2, \ldots, N\}$.

Initialize weights $w_i^1 = 1/N$, $\forall i \in \{1, 2, \ldots, N\}$.

For $t = 1, 2, \ldots, T$ do:
▶ Train a weak classifier $h_t$ with examples weighted using the current weights $w_i^t$, by minimizing the weighted error $\epsilon_t = \sum_{i=1}^{N} w_i^t \, I(h_t(x^i) \neq y^i)$.
▶ Compute $\alpha_t = \frac{1}{2} \ln \frac{1 - \epsilon_t}{\epsilon_t}$.
▶ Update the weights: $w_i^{t+1} = w_i^t \, e^{-\alpha_t y^i h_t(x^i)}$.
▶ Normalize: $w_i^{t+1} = w_i^{t+1} / \sum_{j=1}^{N} w_j^{t+1}$.

Output: Final classifier $h(x) = \operatorname{sign}\left( \sum_{t=1}^{T} \alpha_t h_t(x) \right)$.

† J. Friedman, T. Hastie and R. Tibshirani. Additive logistic regression: a statistical view of boosting. Annals of Statistics, 2000, Vol. 28, No. 2, pp. 337-407.
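In the loss perspective of the cited Friedman, Hastie and Tibshirani paper, these updates perform stagewise minimization of the exponential loss $\sum_{i=1}^{N} e^{-y^i f(x^i)}$ over additive models $f(x) = \sum_t \alpha_t h_t(x)$; note that $\alpha_t > 0$ exactly when $\epsilon_t < 1/2$, i.e. a weak learner only needs to beat random guessing for its vote to carry positive weight. The sketch below is a direct Python transcription of the pseudocode above; the use of depth-1 trees (decision stumps) as weak learners and the numerical guard on $\epsilon_t$ are assumptions not specified on the slide.

    # AdaBoost sketch following the pseudocode above. Decision stumps as weak
    # learners and the clipping guard on eps are assumptions, not from the slide.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost_fit(X, y, T=50):
        """X: (N, d) array; y: (N,) array with labels in {+1, -1}."""
        N = len(X)
        w = np.full(N, 1.0 / N)                    # w_i^1 = 1/N
        models, alphas = [], []
        for t in range(T):
            # Weak classifier trained on examples weighted by w (here: a stump).
            h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = h.predict(X)
            eps = np.sum(w * (pred != y))          # eps_t: weighted training error
            eps = np.clip(eps, 1e-12, 1 - 1e-12)   # guard against log(0) / division by 0
            alpha = 0.5 * np.log((1 - eps) / eps)  # alpha_t = (1/2) ln((1 - eps_t)/eps_t)
            w = w * np.exp(-alpha * y * pred)      # w_i^{t+1} = w_i^t exp(-alpha_t y^i h_t(x^i))
            w /= np.sum(w)                         # normalize to sum to 1
            models.append(h)
            alphas.append(alpha)
        return models, alphas

    def adaboost_predict(models, alphas, X):
        """Final classifier h(x) = sign(sum_t alpha_t h_t(x))."""
        scores = sum(a * m.predict(X) for a, m in zip(alphas, models))
        return np.sign(scores)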
Boosting: AdaBoost

[Slides 10-14: AdaBoost worked through figures; the images are not recoverable from this extraction.]
