Statistics of Inverse Problems Research Group, Institute for Mathematics, Faculty of Mathematics and Computer Science, Ruprecht-Karls-Universität Heidelberg





Last edited on Apr 18, 2024 by JJ.
Thesis:
Inaugural dissertation for the attainment of the doctoral degree of the Combined Faculty of Natural Sciences and Mathematics of Ruprecht-Karls-Universität Heidelberg
urn:nbn:de:bsz:16-heidok-292558

Author:
Sandra Schluttenhofer (Heidelberg University)

Title:
Adaptive minimax testing for inverse problems

Supervisor and examiner:
Jan Johannes (Heidelberg University)

Second examiners:
Enno Mammen (Heidelberg University)
Clément Marteau (Université Lyon I)

Abstract:
This thesis deals with non-parametric hypothesis testing for ill-posed inverse problems, where optimality is measured in a non-asymptotic minimax sense. Loosely speaking, we observe only an approximation of a transformed version of the quantity of interest. Statistical inference, which usually requires an inversion of the transformation, is thus an inverse problem. Particularly challenging are ill-posed inverse problems, where the inverse transformation is not stable. The thesis is divided into two parts, which investigate different ill-posed inverse models: an inverse Gaussian sequence space model with partially unknown operator and a circular convolution model. In both models we derive minimax separation radii of testing, which characterise how much an object has to differ from the null hypothesis to be detectable by a statistical test. We propose two types of testing procedures, an indirect and a direct one. The indirect test is based on a projection-type estimator of the distance to the null hypothesis, and we prove its minimax optimality under mild assumptions. The direct test instead estimates the energy in the image space and thus avoids an inversion of the operator. We highlight the situations in which the direct test also performs optimally. As usual in non-parametric statistics, the performance of our tests depends on the optimal choice of a dimension parameter, which relies on prior knowledge of the underlying structure of the model. We derive adaptive testing strategies by applying a classical Bonferroni aggregation to both the direct and the indirect testing procedures and analyse their performance. Compared with the non-adaptive tests, their separation radii deteriorate by a logarithmic factor, which we show to be an unavoidable cost of adaptation. Since our minimax optimal testing procedures are based on estimators of a quadratic functional, we further explore the connection between the two problems, quadratic functional estimation and minimax testing, in the circular convolution model and show how results from one framework can be exploited in the other. Lastly, we consider minimax testing under privacy constraints, where the observations are deliberately transformed before being released to the statistician in order to protect the privacy of an individual.
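
The contrast between the indirect and the direct test can be sketched schematically in an idealised diagonal Gaussian sequence space model; the notation below (lambda_j for the singular values of the operator, epsilon for the noise level, D for the dimension parameter, rho for the separation radius) is chosen here purely for illustration and is not taken verbatim from the thesis.

% Schematic sketch only: idealised sequence space formulation, notation chosen for illustration.
\[
  Y_j = \lambda_j\,\theta_j + \varepsilon\,\xi_j, \qquad \xi_j \overset{\mathrm{iid}}{\sim} \mathcal{N}(0,1), \qquad j \in \mathbb{N},
\]
\[
  H_0\colon \theta = 0 \qquad \text{against} \qquad H_1\colon \|\theta\|_{\ell^2}^2 \ge \rho^2 \ \text{(within a smoothness class)}.
\]
% Indirect (projection-type) statistic: unbiased for the distance to the null in the preimage space.
\[
  \widehat{q}_D = \sum_{j=1}^{D} \lambda_j^{-2}\bigl(Y_j^2 - \varepsilon^2\bigr), \qquad \mathbb{E}\,\widehat{q}_D = \sum_{j=1}^{D} \theta_j^2.
\]
% Direct statistic: estimates the energy in the image space, so no inversion of the operator is needed.
\[
  \widetilde{q}_D = \sum_{j=1}^{D} \bigl(Y_j^2 - \varepsilon^2\bigr), \qquad \mathbb{E}\,\widetilde{q}_D = \sum_{j=1}^{D} \lambda_j^2\,\theta_j^2.
\]
% Bonferroni aggregation over a finite collection \mathcal{D} of dimension parameters:
% reject H_0 as soon as one of the individual tests at level \alpha/|\mathcal{D}| rejects.

Rejecting when such a statistic exceeds a suitable threshold gives tests of the kind whose separation radii are characterised in the thesis; roughly speaking, the direct statistic is attractive precisely because it does not require inverting the operator.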