David Ek: Approaches to accelerate methods for solving systems of equations arising in nonlinear optimization
Time: Fri 2020-12-11 11.00 - 12.00
Location: Zoom Meeting ID: 636 5838 1373
Lecturer: David Ek
Supervisor: Anders Forsgren
In this pre-defense seminar, I will present selected parts of my upcoming thesis. Methods for solving nonlinear optimization problems typically involve solving systems of equations. The thesis concerns approaches for accelerating some of those methods. In our setting, accelerating means finding a trade-off between the computational cost of an iteration and the quality of the computed search direction. We have designed approaches and derived theoretical results for them in idealized settings. We have also used numerical simulations to investigate the practical performance of the approaches both within and beyond the boundaries of the theoretical frameworks.
The initial part concerns solving strictly convex unconstrained quadratic optimization problems. In particular, it studies exact-linesearch limited-memory quasi-Newton methods that generate search directions parallel to those of the method of preconditioned conjugate gradients. The second part focuses on approaches to accelerate primal-dual interior-point methods. In particular, it considers the cases where the method is applied to bound-constrained nonlinear optimization problems and to quadratic optimization problems with linear inequality constraints.
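To illustrate the setting of the first part, the following is a minimal sketch of the (unpreconditioned) method of conjugate gradients with exact linesearch on a strictly convex quadratic. It is not the thesis's quasi-Newton variant, only the baseline method whose search directions those variants parallel; all names are illustrative.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Minimize the strictly convex quadratic 0.5*x^T A x - b^T x
    (A symmetric positive definite), i.e. solve A x = b.
    Each step length alpha is the exact-linesearch minimizer along p."""
    max_iter = max_iter or b.size
    x = x0.copy()
    r = b - A @ x            # negative gradient (residual)
    p = r.copy()             # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact linesearch step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p    # next direction, A-conjugate to p
        rs = rs_new
    return x
```

In exact arithmetic the method terminates in at most n iterations, which is the ideal-setting property the theoretical results in this part build on.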
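For the second part, a rough sketch of one textbook primal-dual interior-point scheme on the simplest bound-constrained case, a convex QP with nonnegativity bounds, may help fix ideas. This is a generic illustration under stated assumptions (H positive definite, unit starting point, a simple centering rule), not the accelerated approaches of the thesis; all names are illustrative.

```python
import numpy as np

def pdip_qp_bounds(H, c, tol=1e-8, max_iter=100):
    """Primal-dual interior-point sketch for
        minimize 0.5*x^T H x + c^T x  subject to  x >= 0,
    with H symmetric positive definite and z the duals for x >= 0."""
    n = c.size
    x = np.ones(n)
    z = np.ones(n)
    for _ in range(max_iter):
        mu = 0.1 * (x @ z) / n           # centering target
        r_dual = H @ x + c - z           # stationarity residual
        r_comp = x * z - mu              # perturbed complementarity
        if np.linalg.norm(r_dual) < tol and x @ z < tol:
            break
        # Condensed Newton system after eliminating dz:
        #   (H + diag(z/x)) dx = -(r_dual + r_comp/x)
        dx = np.linalg.solve(H + np.diag(z / x), -(r_dual + r_comp / x))
        dz = -(r_comp + z * dx) / x
        # Fraction-to-boundary rule keeps (x, z) strictly positive
        alpha = 1.0
        for v, dv in ((x, dx), (z, dz)):
            neg = dv < 0
            if neg.any():
                alpha = min(alpha, 0.995 * np.min(-v[neg] / dv[neg]))
        x += alpha * dx
        z += alpha * dz
    return x
```

The dominant cost per iteration is the linear solve with H + diag(z/x); the trade-off mentioned above is between how cheaply such systems are solved, perhaps approximately, and how good the resulting search direction is.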
The thesis abstract and an electronic version of the thesis can be found at: