Statistics Group, Institute of Applied Mathematics, Faculty of Mathematics and Computer Science, Ruprecht-Karls-Universität Heidelberg

Completed theses

Last edited on Oct 18, 2021 by JJ
Bachelor in Mathematics

Konstantin Klumpp

Adaptive posterior concentration rates under sup norm loss


This bachelor thesis is based on the results of Hoffmann et al. (2015). After a brief introduction to asymptotic theory in nonparametric Bayesian statistics and to the Gaussian sequence model, we present a lower bound on the expected posterior mass of complements of neighborhoods of the "true" parameter. This bound, like all theorems in Hoffmann et al. (2015), relies on a precise description of the interplay between the loss function l and a premetric d that controls the likelihood ratio on the parameter space in a suitable way. After verifying that the necessary conditions hold in the Gaussian sequence model, we construct two different priors in this model, with the sup-norm as loss function, which attain the lower bound up to a constant in the exponent. According to the authors, no such optimal construction in this model existed before Hoffmann et al. (2015). One of the two priors is a so-called sieve prior, which is concentrated on a finite subset of the parameter space. For this class of priors we also derive an upper bound, analogous to the lower bound above, under more general conditions.
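For orientation, the general shape of such contraction statements can be sketched as follows; the notation here is the standard one from the posterior-contraction literature and is not quoted from the thesis:

```latex
% Gaussian sequence model: noisy observations of the coefficients theta_k,
%   X_k = \theta_k + \tfrac{1}{\sqrt{n}}\,\xi_k,
%   \qquad \xi_k \overset{\text{iid}}{\sim} \mathcal{N}(0,1), \quad k \ge 1.
%
% Posterior contraction at rate \varepsilon_n under a loss \ell: for a
% sufficiently large constant M,
%   \mathbb{E}_{\theta_0}\,
%     \Pi\bigl(\theta : \ell(\theta,\theta_0) \ge M\,\varepsilon_n \,\big|\, X\bigr)
%     \longrightarrow 0.
%
% A lower bound of the type discussed above says that, for some small m > 0,
%   \mathbb{E}_{\theta_0}\,
%     \Pi\bigl(\theta : \ell(\theta,\theta_0) \ge m\,\varepsilon_n \,\big|\, X\bigr)
% cannot decay faster than a prescribed rate, so upper and lower bounds
% together pin down \varepsilon_n up to constants.
```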

M. Hoffmann, J. Rousseau, and J. Schmidt-Hieber. On adaptive posterior concentration rates. The Annals of Statistics, 43(5):2259–2295, 2015.