
André Schwalbe Lehtihet: A Mathematical and Numerical Study on Generalization Error Estimates for Neural Network Approximations

Time: Tue 2022-06-07 15.00 - 15.30

Location: KTH Lindstedtsvägen 25, Room 3424

Respondent: André Schwalbe Lehtihet

Supervisor: Anders Szepessy


Abstract:

Machine learning has, since its introduction, become an increasingly common tool for solving a wide array of problems, ranging from regression to speech recognition. At the core of machine learning is the neural network, a function consisting of the alternating application of affine maps and non-linear maps called activation functions. In this thesis we have studied generalization error estimates for certain types of neural networks and tested these bounds numerically. This has been done by reducing the training of the network to optimally sampling the frequencies that define it, and by reducing the training of the remaining parameters to an easily optimized convex problem. We found that the derived error bounds were not sharp, and that the bounds found empirically using the optimal sampling could be well matched by a method based on Metropolis sampling.
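To make the frequency-sampling idea concrete, the following is a minimal sketch (not the thesis code) of a random-feature network in Python: the frequencies defining the network are drawn from a proposal density, and only the amplitudes are trained, which amounts to ridge regression, a convex problem with a closed-form solution. The target function, sampling density, and all parameter names are illustrative assumptions.

import numpy as np

# Sketch of a one-hidden-layer random-feature network
#   f(x) = sum_k beta_k * cos(omega_k * x + b_k),
# where the frequencies omega_k are sampled and only the
# amplitudes beta_k are fitted (a convex least-squares problem).

rng = np.random.default_rng(0)

# Illustrative target function and training data.
def target(x):
    return np.sin(3.0 * x) + 0.5 * np.cos(7.0 * x)

n_train, n_features = 200, 100
x = rng.uniform(-np.pi, np.pi, size=n_train)
y = target(x)

# Sample frequencies from a proposal density (here a normal density;
# the thesis concerns how to sample them optimally).
omega = rng.normal(0.0, 5.0, size=n_features)
b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

# Feature matrix: Phi[i, k] = cos(omega_k * x_i + b_k).
Phi = np.cos(np.outer(x, omega) + b)

# Fitting the amplitudes is ridge regression, solved in closed form.
lam = 1e-6
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_features), Phi.T @ y)

# Estimate the generalization error on fresh samples.
x_test = rng.uniform(-np.pi, np.pi, size=1000)
Phi_test = np.cos(np.outer(x_test, omega) + b)
mse = np.mean((Phi_test @ beta - target(x_test)) ** 2)
print(f"test MSE: {mse:.3e}")

In this setting, changing how omega is drawn (for instance, by a Metropolis scheme targeting a frequency density adapted to the data) changes the empirical generalization error, which is the kind of comparison the thesis carries out numerically.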