
Geometric Mechanics and Control Seminar

CM vs NAG: a variational approach

Speaker:  Cédric Martínez Campos (Universidad Rey Juan Carlos)
Date:  Friday, 05 March 2021 - 15:30
Place:  Online - zoom.us/j/93006808687?pwd=THpXbzhXTGJJY29KeXQxRTUvaGN1QT09 (ID: 930 0680 8687; Access code: 659820)

Abstract:
In 1964, Boris T. Polyak introduced Classical Momentum (CM), a method to speed up the convergence of iterative schemes that seek a solution of the functional equation P(x) = 0; in particular, it outperformed Gradient Descent (GD). Later, in 1983, Yuri Nesterov introduced a new method, Nesterov Accelerated Gradient (NAG), which further improved the convergence rate. Over the last decade, NAG has attracted renewed interest because of its extensive use in Machine Learning and was, in fact, one of the topics of a plenary lecture at the last ICM. In this talk, we will review both methods and show how they can be easily derived variationally, underlining their strong relation. Simulations will be performed to illustrate the methods.
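
For reference, the sketch below shows the standard textbook update rules for the three methods named in the abstract (GD, CM as Polyak's heavy ball, and NAG), applied to a simple quadratic. It is only an illustration of the iterations themselves, not the speaker's variational derivation; the matrix A, the step size alpha, and the momentum beta are illustrative choices.

```python
# Minimal sketch of GD, CM (heavy ball) and NAG on f(x) = 0.5 * x^T A x.
# All parameter values are illustrative assumptions, not values from the talk.
import numpy as np

A = np.diag([1.0, 10.0])          # ill-conditioned quadratic
grad = lambda x: A @ x            # gradient of f(x) = 0.5 * x^T A x
x0 = np.array([1.0, 1.0])
alpha, beta, steps = 0.05, 0.9, 200

def gd(x):
    # Gradient Descent: x_{k+1} = x_k - alpha * grad(x_k)
    for _ in range(steps):
        x = x - alpha * grad(x)
    return x

def cm(x):
    # Classical Momentum (Polyak's heavy ball):
    # x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
    x_prev = x.copy()
    for _ in range(steps):
        x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
    return x

def nag(x):
    # Nesterov Accelerated Gradient: gradient evaluated at a look-ahead point
    # y_k = x_k + beta * (x_k - x_{k-1});  x_{k+1} = y_k - alpha * grad(y_k)
    x_prev = x.copy()
    for _ in range(steps):
        y = x + beta * (x - x_prev)
        x, x_prev = y - alpha * grad(y), x
    return x

for name, method in [("GD", gd), ("CM", cm), ("NAG", nag)]:
    # Distance to the minimizer (the origin) after a fixed number of steps.
    print(name, np.linalg.norm(method(x0)))
```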