Ruprecht-Karls-Universität Heidelberg · Institute for Mathematics · Statistics of Inverse Problems Research Group



Thesis:
Bachelor's thesis in Mathematics

Author:
Jakob Schulz

Title:
Optimal convergence of classifiers in statistical learning

Supervisor:
Jan Johannes

Abstract:
In statistical learning, empirical risk minimization (ERM) classifiers are one approach to achieving the fastest possible convergence rates. These rates depend chiefly on two parameters: the complexity of the class of sets from which the classifier is chosen, and the margin parameter. This thesis provides an introduction to empirical process theory and, building on it, derives an upper bound on the rate of convergence in terms of these parameters, as well as a similar bound for the case in which the parameters are unknown. It is then shown that the rates obtained in this way cannot be improved in a minimax sense.
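For orientation, a minimal sketch of the shape of these rates in the parametrization of Tsybakov (2004). The symbols R (risk), f^* (Bayes classifier), \hat f_n (ERM classifier), \kappa (margin parameter) and \rho (complexity exponent) are introduced here for illustration and need not match the notation used in the thesis.

% Sketch, assuming Tsybakov's (2004) parametrization: excess risk of the
% empirical risk minimizer \hat f_n over a class of sets with entropy
% exponent \rho > 0, under the margin assumption with parameter \kappa >= 1.
\[
  \mathbb{E}\, R(\hat f_n) - R(f^{*})
    \;\le\; C\, n^{-\kappa/(2\kappa + \rho - 1)} .
\]
% For \kappa = 1 and \rho near 0 this approaches the fast rate n^{-1};
% as \kappa grows (ever weaker margin assumption) the exponent tends to
% 1/2, the classical slow rate.

The matching minimax lower bound in the cited paper is what makes these rates unimprovable in the sense stated above.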

References:
A. B. Tsybakov. Optimal aggregation of classifiers in statistical learning. The Annals of Statistics, 32(1):135–166, 2004.