Programme





S. Anthoine: Optimal Computational Trade-Off of Inexact Proximal Methods.



This is joint work with Pierre Machart (LIF, Marseille).

Recent advances in machine learning and signal processing have led to more involved optimization problems, while the abundance of data calls for more efficient optimization algorithms. First-order methods are now extensively employed to tackle these issues and, among them, proximal-gradient algorithms are becoming increasingly popular. At the heart of these procedures lies the proximity operator. In favorable cases, it admits an analytical form. In many problems, however, the proximity operator can only be computed numerically, giving rise to what can be referred to as inexact proximal-gradient algorithms. With those algorithms, one needs to set: a) the number of iterations of the procedure, and b) the precision of the approximation of the proximity operator at each iteration. These quantities, which can be seen as hyper-parameters, are the subject of this talk. By expressing the computational cost of inexact proximal-gradient algorithms, we derive a computationally optimal strategy for setting those hyper-parameters.
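
To make the setting concrete, here is a minimal Python sketch (not the authors' construction) of an inexact proximal-gradient scheme in which both hyper-parameters appear explicitly: the number of outer iterations n_iter and a per-iteration precision schedule eps_k for the inner solver that approximates the proximity operator. The composite objective, the dual inner solver, its stopping rule, and the eps_k ~ 1/k^4 schedule are all illustrative assumptions, not the method of the talk.

```python
import numpy as np

def inexact_prox_l1_analysis(v, B, c, eps, max_inner=10_000):
    """Approximate prox of u -> c * ||B u||_1 at v, i.e. an approximate
    minimiser of 0.5 * ||u - v||^2 + c * ||B u||_1.

    Computed by projected gradient ascent on the dual variable p
    (constraint ||p||_inf <= c); stopping once the dual iterates move
    by less than eps is a simple stand-in for an eps-accurate prox.
    """
    p = np.zeros(B.shape[0])
    step = 1.0 / np.linalg.norm(B @ B.T, 2)   # 1 / Lipschitz const. of dual gradient
    for _ in range(max_inner):
        p_new = np.clip(p + step * (B @ (v - B.T @ p)), -c, c)
        if np.linalg.norm(p_new - p) <= eps:
            p = p_new
            break
        p = p_new
    return v - B.T @ p                         # primal point recovered from the dual

def inexact_proximal_gradient(A, b, B, lam, n_iter, eps_schedule):
    """Inexact proximal-gradient descent on
        F(x) = 0.5 * ||A x - b||^2 + lam * ||B x||_1,
    with the prox computed numerically to precision eps_k at iteration k."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A.T @ A, 2)             # Lipschitz constant of the gradient
    for k in range(n_iter):
        grad = A.T @ (A @ x - b)               # forward (gradient) step
        x = inexact_prox_l1_analysis(x - grad / L, B, lam / L, eps_schedule[k])
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
B = np.diff(np.eye(20), axis=0)                # 1-D finite differences (TV-like penalty)
n_iter = 200
# A decaying schedule eps_k ~ 1/k^4 (an assumption here) is one choice known
# to preserve the O(1/k) convergence rate of the exact method.
eps_schedule = [1.0 / (k + 1) ** 4 for k in range(n_iter)]
x_hat = inexact_proximal_gradient(A, b, B, lam=0.5,
                                  n_iter=n_iter, eps_schedule=eps_schedule)
```

If the inner solver converges linearly, an eps-accurate prox costs on the order of log(1/eps) inner iterations, so the total computational budget depends jointly on n_iter and on the schedule eps_k; trading these two hyper-parameters off against the achievable accuracy is precisely the question the talk addresses.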