Université Paris 6 Pierre et Marie Curie | Université Paris 7 Denis Diderot

CNRS U.M.R. 7599 "Probabilités et Modèles Aléatoires"

**Author(s):**

**MSC Classification Code(s):**

- 62F35 Robustness and adaptive procedures
- 62J02 General nonlinear regression
- 94A17 Measures of information, entropy
- 62G07 Curve estimation (nonparametric regression, density estimation, etc.)

**Abstract:** We present an alternative to the penalized maximum likelihood
approach to model selection. Instead of penalizing the likelihood,
we consider its quantiles under some prior distribution on the
parameter set, in order to derive non-asymptotic deviation bounds for
some randomized estimators. This leads to a new measure of the
complexity of models which, unlike Vapnik's entropy,
is (in principle) computable from empirical observations.

**Keywords:** *Model selection ; pattern recognition ; least squares regression ;
oracle inequalities ; deviation inequalities ; pseudo-Bayesian methods*

**Date:** 2001-07-02

**Preprint number:** *PMA-677*