Sebastian Fodor: Understanding Convergence of Accelerated Gradient Descent Methods
BSc thesis presentation
Time: Wednesday 2020-06-03, 08:30–09:30
Speaker: Sebastian Fodor
Supervisor: Yishao Zhou
In this thesis we study a generalized version of Nesterov's Accelerated Gradient Method (NAG) that has three parameters instead of two, the additional parameter controlling the amount of gradient correction. This gives the Generalized Nesterov's Accelerated Gradient Method (GNAG), of which both NAG and Polyak's heavy-ball method are special cases. We derive a differential equation that approximates this method and show that it converges at a linear rate for strongly convex functions with Lipschitz continuous gradients. We also consider GNAG as a dynamical system and show that this dynamical system converges at a linear rate to a steady state, which is the optimal solution. We ask whether GNAG converges faster than NAG for certain choices of the gradient correction parameter, and through numerical examples conclude that a higher gradient correction parameter can lead to faster convergence.
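The abstract does not state the exact update rule, but one common three-parameter family containing both heavy-ball and NAG uses a step size s, a momentum coefficient beta, and a gradient correction coefficient gamma. The sketch below is a hypothetical illustration under that assumed parameterization, not the thesis's definitive formulation: gamma = 0 gives Polyak's heavy-ball method, and gamma = beta recovers NAG.

```python
import numpy as np

def gnag(grad_f, x0, s, beta, gamma, n_iter):
    """Assumed three-parameter accelerated gradient iteration:
        x_{k+1} = x_k + beta*(x_k - x_{k-1}) - s*grad_f(x_k)
                  - gamma*s*(grad_f(x_k) - grad_f(x_{k-1}))
    gamma = 0   -> Polyak's heavy-ball method
    gamma = beta -> Nesterov's accelerated gradient method (NAG)
    """
    x_prev = x0.copy()          # x_{k-1}
    x = x0.copy()               # x_k
    g_prev = grad_f(x0)         # gradient at x_{k-1}
    for _ in range(n_iter):
        g = grad_f(x)
        # momentum step, gradient step, and gradient correction term
        x_next = x + beta * (x - x_prev) - s * g - gamma * s * (g - g_prev)
        x_prev, x, g_prev = x, x_next, g
    return x

# Example: a strongly convex quadratic f(x) = 0.5 * x^T A x with
# Lipschitz continuous gradient; the minimizer is x* = 0.
A = np.diag([1.0, 10.0])
grad_f = lambda x: A @ x
x_star = gnag(grad_f, np.array([1.0, 1.0]), s=0.05, beta=0.8, gamma=0.8, n_iter=500)
```

On this quadratic the iterates contract geometrically, matching the linear convergence rate discussed in the thesis; the parameter values here are illustrative choices only.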