Université Paris 6 Pierre et Marie Curie | Université Paris 7 Denis Diderot

CNRS U.M.R. 7599

"Probabilités et Modèles Aléatoires"

**Author(s):**

**MSC Classification Code(s):**

- 60G99 None of the above, but in this section
- 62C12 Empirical decision procedures; empirical Bayes procedures
- 62G99 None of the above, but in this section

**Abstract:** The probability of error of classification methods based on convex
combinations of simple base classifiers produced by "boosting" algorithms
is investigated. The main result of the paper is that certain
regularized boosting algorithms provide Bayes-risk consistent
classifiers under the sole assumption that the Bayes classifier
may be approximated by a convex combination of the base
classifiers. Non-asymptotic distribution-free bounds are also
developed which offer interesting new insight into how boosting
works and help explain its success in practical classification
problems.
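The setting the abstract describes can be made concrete with a small sketch: base classifiers (here, hypothetical decision stumps) are combined by greedy minimization of a convex surrogate cost (the exponential loss, as in AdaBoost), and the combination is then regularized by rescaling its coefficients into an l1-ball. This is only an illustrative stand-in for the regularized procedures the paper analyzes; all function names and the specific regularization step are assumptions.

```python
import numpy as np

def stump(j, t, s):
    """Base classifier: s * sign(x[j] - t), a one-dimensional decision stump."""
    return lambda X: s * np.sign(X[:, j] - t + 1e-12)

def regularized_boost(X, y, stumps, n_rounds=20, lam=1.0):
    """Greedy coordinate descent on the exponential (convex) surrogate loss,
    followed by rescaling the coefficient vector into an l1-ball of radius lam.
    Illustrative sketch only; not the paper's exact algorithm."""
    n = len(y)
    preds = np.array([h(X) for h in stumps])       # (m, n) base predictions
    alphas = np.zeros(len(stumps))
    F = np.zeros(n)                                # current combined score
    for _ in range(n_rounds):
        w = np.exp(-y * F)
        w /= w.sum()                               # exponential-loss weights
        errs = np.array([(w * (p != y)).sum() for p in preds])
        k = int(np.argmin(errs))                   # best base classifier
        eps = min(max(errs[k], 0.01), 0.99)        # clip for numerical safety
        a = 0.5 * np.log((1 - eps) / eps)          # AdaBoost-style step size
        alphas[k] += a
        F += a * preds[k]
    norm = np.abs(alphas).sum()                    # regularize: project onto l1-ball
    if norm > lam:
        alphas *= lam / norm
    return alphas, preds

# Toy 1-D data, separable by a threshold at 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = np.where(X[:, 0] > 0, 1, -1)
stumps = [stump(0, t, s) for t in np.linspace(-1, 1, 9) for s in (-1, 1)]
alphas, preds = regularized_boost(X, y, stumps, lam=1.0)
yhat = np.sign(alphas @ preds)
print("training accuracy:", (yhat == y).mean())
```

Note that rescaling the coefficients leaves the sign of the combined vote, and hence the resulting classifier, unchanged; the regularization matters for the generalization analysis, not for the fit on the training data.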

**Keywords:** *boosting ; overfitting ; data classification ; Bayes-risk consistency ;
regularized methods ; convex cost functions ; penalized model selection ; empirical processes*

**Date:** 2003-03-06

**Preprint number:** *PMA-801*

**Postscript file:** PMA-801.ps