Ruprecht-Karls-Universität Heidelberg, Institute for Mathematics, Statistics of Inverse Problems Research Group





Thesis:
Bachelor in Mathematics (50%)

Author:
Melissa Weber

Title:
Lineare Einfachregression insbesondere in höherdimensionalen Modellen (Simple linear regression, particularly in higher-dimensional models)

Supervisors:
Sergio Brenner Miguel
Jan Johannes

Abstract:
In this thesis, simple linear regression is introduced and then extended to input data of higher dimension. The focus lies on the distinction between deterministic and stochastic input data: the first two chapters treat fixed and random design separately, so that the respective advantages and limitations of each setting become clear. A large chapter deals with determining the estimators in (multiple) linear regression via the method of least squares and concludes with theoretical results such as the Gauß-Markov theorem, which shows that the least squares estimator is the best linear unbiased estimator. This chapter applies to input data of any dimension and distinguishes between fixed and random design only for certain inferences. Finally, ridge regression is introduced, with a focus on graphical visualization as well as implementation in the statistical software R; this regression is of interest because it offers advantages over the least squares estimator, particularly in high dimensions. An outlook at the end of the thesis is intended to stimulate further interest in the topic and to indicate possible directions for continuing the work.
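
As a companion to the abstract, the following is a minimal sketch in R (not taken from the thesis) of the two estimators discussed above: the least squares estimator and the ridge estimator in a multiple linear regression with fixed design. The simulated data, the true coefficient vector and the penalty value lambda are assumptions chosen purely for illustration.

```r
## Minimal sketch (not from the thesis): least squares vs. ridge estimator
## in a multiple linear regression y = X beta + noise with fixed design.
set.seed(1)
n <- 100                                   # sample size (assumed)
p <- 5                                     # dimension of the input data (assumed)
X <- matrix(rnorm(n * p), nrow = n)        # fixed design matrix
beta <- c(2, -1, 0.5, 0, 3)                # true coefficients (assumed)
y <- X %*% beta + rnorm(n)                 # response with standard Gaussian noise

## Least squares estimator: solves (X'X) b = X'y
beta_ls <- solve(crossprod(X), crossprod(X, y))

## Ridge estimator with penalty lambda: solves (X'X + lambda I) b = X'y
lambda <- 1                                # penalty parameter (assumed)
beta_ridge <- solve(crossprod(X) + lambda * diag(p), crossprod(X, y))

## Compare the true coefficients with both estimates
cbind(true = beta,
      least_squares = as.vector(beta_ls),
      ridge = as.vector(beta_ridge))
```

Setting lambda to zero recovers the least squares solution; larger values shrink the coefficients towards zero, which is the behaviour that becomes advantageous in high dimensions, as noted in the abstract.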

References:
S. Richter. Statistisches und maschinelles Lernen, Springer Spektrum, Berlin, 2019.