Datalab Seminar

Optimization Algorithms and their Optimality for some Structured Problems

Speaker:  David Martínez Rubio (Universidad Carlos III de Madrid)
Date:  Thursday, 03 April 2025 - 12:00
Place:  Aula Gris 1, ICMAT

Abstract:

Nesterov's accelerated gradient descent is one of the most celebrated methods in continuous optimization. It exploits the Lipschitz continuity of the gradient of a convex function to obtain optimal convergence rates for its minimization. Motivated by applications in logistic regression, optimal transport, \(\ell_p\)-regression, distributionally robust optimization, hard-margin SVM, and box-simplex games, there have been recent efforts to understand how far the acceleration phenomenon can be exploited in other settings. In particular, in some structured settings one can obtain much faster convergence, and in many of these, exploiting the function's regularity with respect to a non-Euclidean norm is essential. I will present a general optimization algorithm with guarantees whose convergence rate is optimal in these settings, even against randomized algorithms, or parallel ones that make polynomially-many-in-the-dimension queries to a local oracle of the function at each iteration.
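
For context on the method named in the first sentence, here is a minimal sketch of the classical (Euclidean) Nesterov accelerated gradient method for an L-smooth convex function. This illustrates only the textbook algorithm, not the non-Euclidean method of the talk; the quadratic test problem and all names in the snippet are illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iters=500):
    """Minimize an L-smooth convex function via Nesterov's acceleration.

    grad    : callable x -> gradient of f at x
    x0      : initial point (array-like)
    L       : Lipschitz constant of the gradient of f
    n_iters : number of iterations

    Achieves the optimal O(1/k^2) rate on f(x_k) - f(x*) in this class,
    versus O(1/k) for plain gradient descent.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        x_new = y - grad(y) / L          # gradient step at the extrapolated point
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

# Illustrative test problem (an assumption, not from the talk):
# f(x) = 0.5 * x^T A x - b^T x has gradient A x - b, and its
# gradient-Lipschitz constant is the largest eigenvalue of A.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + np.eye(20)            # symmetric positive definite
b = rng.standard_normal(20)
L = np.linalg.eigvalsh(A).max()

x_star = np.linalg.solve(A, b)      # exact minimizer for comparison
x_hat = nesterov_agd(lambda x: A @ x - b, np.zeros(20), L)
print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```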



