The following Matlab project contains the source code and Matlab examples used for a LogitBoost implementation.
Code for AOSO-LogitBoost, a recent (and arguably state-of-the-art) implementation of Friedman's LogitBoost for multi-class classification.
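The Matlab source itself is not reproduced here, but the core LogitBoost iteration the project implements — compute class probabilities, form the Newton working response, and fit a weighted least-squares weak learner — can be sketched for the binary case. This is an illustrative Python sketch (the multi-class AOSO variant is more involved); all function names and the regression-stump learner are assumptions for the example, not code from the project:

```python
import numpy as np

def wls_stump(X, z, w):
    # Weighted least-squares regression stump: split on one feature,
    # predict the weighted mean of z on each side of the threshold.
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] < t
            wl, wr = w[left].sum(), w[~left].sum()
            cl = (w[left] * z[left]).sum() / wl if wl > 0 else 0.0
            cr = (w[~left] * z[~left]).sum() / wr if wr > 0 else 0.0
            err = (w * (z - np.where(left, cl, cr)) ** 2).sum()
            if err < best_err:
                best_err, best = err, (j, t, cl, cr)
    return best

def logitboost(X, y01, rounds=10):
    # Binary LogitBoost (Friedman, Hastie & Tibshirani); labels y01 in {0, 1}.
    n = len(y01)
    F = np.zeros(n)
    model = []
    for _ in range(rounds):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))      # current class-1 probability
        w = np.clip(p * (1 - p), 1e-10, None)   # Newton weights
        z = (y01 - p) / w                       # working response
        j, t, cl, cr = wls_stump(X, z, w)       # fit weak learner to z
        F += 0.5 * np.where(X[:, j] < t, cl, cr)
        model.append((j, t, cl, cr))
    return model

def predict_logit(model, X):
    F = sum(0.5 * np.where(X[:, j] < t, cl, cr) for j, t, cl, cr in model)
    return (F > 0).astype(int)
```

The AOSO ("adaptive one-vs-one") variant extends this to K classes by choosing, at each node of the weak learner, the pair of classes whose separation most reduces the loss.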

AdaBoost, short for "Adaptive Boosting", is a machine learning meta-algorithm formulated by Yoav Freund and Robert Schapire, who won the prestigious Gödel Prize in 2003 for their work. It can be used in conjunction with many other types of learning algorithms to improve their performance.

The following Matlab project contains the source code and Matlab examples used for multi-class Gentle AdaBoost.
Gentle AdaBoost classifier with two different weak learners: decision stump and perceptron.
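Gentle AdaBoost differs from discrete AdaBoost in that each round fits a real-valued weak learner to the labels by weighted least squares (a Newton step on the exponential loss) instead of a hard classifier with a computed vote weight. A hedged Python sketch of the binary case with the decision-stump weak learner mentioned above (the perceptron variant is analogous; names are illustrative):

```python
import numpy as np

def gentle_stump(X, y, w):
    # Regression stump minimizing weighted squared error to y in {-1, +1}:
    # each side of the split predicts the weighted mean label there.
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] < t
            wl, wr = w[left].sum(), w[~left].sum()
            cl = (w[left] * y[left]).sum() / wl if wl > 0 else 0.0
            cr = (w[~left] * y[~left]).sum() / wr if wr > 0 else 0.0
            err = (w * (y - np.where(left, cl, cr)) ** 2).sum()
            if err < best_err:
                best_err, best = err, (j, t, cl, cr)
    return best

def gentle_adaboost(X, y, rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)
    model = []
    for _ in range(rounds):
        j, t, cl, cr = gentle_stump(X, y, w)
        f = np.where(X[:, j] < t, cl, cr)    # real-valued weak output
        w *= np.exp(-y * f)                  # gentler reweighting: |f| <= 1
        w /= w.sum()
        model.append((j, t, cl, cr))
    return model

def predict_gentle(model, X):
    F = sum(np.where(X[:, j] < t, cl, cr) for j, t, cl, cr in model)
    return np.sign(F)
```

Because the stump outputs are bounded weighted means rather than unbounded log-odds votes, Gentle AdaBoost is typically more robust to noisy labels and outliers.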

The following Matlab project contains the source code and Matlab examples used for classic adaboost classifier.
This is a classic AdaBoost implementation in a single file, with easily understandable code.

The following Matlab project contains the source code and Matlab examples used for AdaBoost, the meta machine learning algorithm formulated by Yoav Freund and Robert Schapire.
AdaBoost (Adaptive Boosting) is a well-known meta machine learning algorithm that was proposed by Yoav Freund and Robert Schapire.
