AdaBoost (Adaptive Boosting)


AdaBoost, short for Adaptive Boosting, is a machine learning meta-algorithm formulated by Yoav Freund and Robert Schapire, who won the 2003 Gödel Prize for their work. It can be used in conjunction with many other types of learning algorithms to improve performance. The output of the other learning algorithms is combined into a weighted sum that represents the final output of the boosted classifier. AdaBoost is adaptive in the sense that subsequent weak learners are tweaked in favor of those instances misclassified by previous classifiers. On some problems it can be less susceptible to overfitting than other learning algorithms. The individual learners can be weak, but as long as the performance of each one is slightly better than random guessing, the final model can be proven to converge to a strong learner.
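
To make the reweighting and weighted-sum ideas concrete, here is a minimal from-scratch sketch of discrete AdaBoost using decision stumps as the weak learners. The function names (fit_adaboost, predict_adaboost) and the choice of scikit-learn stumps are illustrative assumptions, not part of any particular library's AdaBoost API; labels are assumed to be encoded as -1/+1.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_adaboost(X, y, n_rounds=50):
    """Train weak learners on reweighted data; return (stumps, alphas)."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform instance weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)  # weak learner sees current weights
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 0.5:                    # no better than random guessing: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        # Adaptive step: raise the weights of misclassified instances and
        # lower those of correctly classified ones, then renormalize.
        w *= np.exp(-alpha * y * pred)
        w /= np.sum(w)
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict_adaboost(stumps, alphas, X):
    """Boosted classifier: sign of the alpha-weighted sum of weak outputs."""
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```

In practice, scikit-learn's built-in AdaBoostClassifier covers the same procedure with more options; the sketch above is only meant to show how each round reweights the misclassified instances and how the final prediction is a weighted vote over the weak learners.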


