The LOS Seminar is the working seminar of the LOS research center.
The seminars are held online. Unless otherwise indicated, all seminars take place on Tuesdays at 14:00.
To receive announcements about the seminar, please send an email to los@fmi.unibuc.ro.
Laurențiu Leuștean (LOS and IMAR)
Proof mining and applications in nonlinear analysis and convex optimization. Part II
Laurențiu Leuștean (LOS and IMAR)
Proof mining and applications in nonlinear analysis and convex optimization
Abstract: The research program of proof mining is concerned with the extraction of hidden finitary content
from proofs that make use of highly infinitary principles. The new information is obtained
after a logical analysis, using proof-theoretic tools, and can be both quantitative,
such as algorithms and effective bounds, and qualitative, such as uniformities in the
bounds or the weakening of premises. Thus, even if one is not particularly interested in
the numerical details of the bounds themselves, in many cases such explicit bounds immediately
show the independence of the quantity in question from certain input data. This line of research,
developed by Ulrich Kohlenbach in the 1990s, has its roots in Georg Kreisel's program on
the unwinding of proofs, put forward in the 1950s. Proof mining is related to Terence Tao's
proposal of hard analysis, based on finitary arguments, as opposed to the infinitary ones of soft analysis.
In this talk we give an introduction to proof mining and present some recent applications in
nonlinear analysis and convex optimization.
Andrei Sipoș (LOS and IMAR)
On abstract proximal point algorithms and related concepts
Abstract: The proximal point algorithm is a fundamental tool of convex optimization, traditionally
attributed to Martinet [1], Rockafellar [2] (who named it) and Brézis and Lions [3]. In its many
variants, it usually operates by iterating, starting from a given point, a sequence of mappings dubbed
"resolvents", whose fixed points coincide with the solutions of the optimization problem that
one is aiming at. It has been observed by Eckstein [4] that the arguments used to prove the convergence
of this algorithm "hinge primarily on the firmly nonexpansive properties of the resolvents".
Inspired by this remark, the speaker, together with L. Leuştean and A. Nicolae, has introduced
in [5] the notion of jointly firmly nonexpansive family of mappings, which allows for a highly
abstract proof of the proximal point algorithm's convergence, encompassing virtually all known
variants in the literature. We have recently revisited the concept [6], providing a conceptual
characterization of it and showing how it may be used to prove other kinds of results usually
associated with resolvent-type mappings. In this talk, we shall report on all these findings
and point towards works in progress and future developments.
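As a concrete illustration (not taken from the abstract or from [5, 6]), here is a minimal Python sketch of the classical proximal point iteration x_{k+1} = prox_{λf}(x_k) for the simple convex function f(y) = |y|, whose resolvent is the soft-thresholding map; the function names and parameter values are ours.

    import numpy as np

    def prox_abs(x, lam):
        # Resolvent (proximal operator) of f(y) = |y|: the soft-thresholding map.
        return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

    def proximal_point(x0, lam=1.0, n_iter=50):
        # Classical proximal point iteration x_{k+1} = prox_{lam*f}(x_k);
        # the fixed points of the resolvent are exactly the minimizers of f.
        x = x0
        for _ in range(n_iter):
            x = prox_abs(x, lam)
        return x

    print(proximal_point(5.0))  # converges to 0.0, the minimizer of |y|

In the general algorithm the step sizes may vary from iteration to iteration and the resolvents rarely have such a closed form; the abstract results in [5, 6] concern families of such resolvent-type mappings.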
References:
[1] B. Martinet, Régularisation d'inéquations variationnelles par approximations successives. Rev. Française Informat. Recherche Opérationnelle 4, 154–158, 1970.
[2] R. T. Rockafellar, Monotone operators and the proximal point algorithm. SIAM J. Control Optim. 14, 877–898, 1976.
[3] H. Brézis, P. L. Lions, Produits infinis de résolvantes. Israel J. Math. 29, 329–345, 1978.
[4] J. Eckstein, The Lions-Mercier Splitting Algorithm and the Alternating Direction Method are Instances of the Proximal Point Algorithm. Report LIDS-P-1769, Laboratory for Information and Decision Systems, MIT, 1989.
[5] L. Leuştean, A. Nicolae, A. Sipoş, An abstract proximal point algorithm. Journal of Global Optimization 72(3), 553–577, 2018. DOI: 10.1007/s10898-018-0655-9.
[6] A. Sipoş, Revisiting jointly firmly nonexpansive families of mappings. Optimization, 2021. DOI: 10.1080/02331934.2021.1915312.
Andrei Pătraşcu (LOS)
On the complexity of first-order algorithms for convex optimization. Part II
Andrei Pătraşcu (LOS)
On the complexity of first-order algorithms for convex optimization
Abstract: The potential of first-order algorithms in the big data context has been confirmed
repeatedly in the last decades, in fields like machine learning, signal processing and computer science.
In this seminar we will introduce and analyze the main first-order methods successfully used to solve
many convex optimization models under various smoothness and convexity assumptions.
Our analysis will provide insights into their iterative behaviour and iteration complexity in
deterministic and stochastic (noisy) settings. Finally, some concrete applications will be discussed.
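To make the setting concrete, here is a minimal Python sketch (not from the talk; the toy problem and all names are ours) of the most basic first-order method, gradient descent with constant step size 1/L on a smooth convex quadratic.

    import numpy as np

    def gradient_descent(grad, x0, L, n_iter=100):
        # Basic first-order method with constant step size 1/L,
        # where L is a Lipschitz constant of the gradient.
        x = x0
        for _ in range(n_iter):
            x = x - grad(x) / L
        return x

    # Toy example: minimize f(x) = 0.5 * ||A x - b||^2 (smooth and convex).
    A = np.array([[2.0, 0.0], [0.0, 1.0]])
    b = np.array([1.0, 1.0])
    grad = lambda x: A.T @ (A @ x - b)
    L = np.linalg.norm(A.T @ A, 2)  # spectral norm gives a Lipschitz constant
    print(gradient_descent(grad, np.zeros(2), L))  # approaches [0.5, 1.0]

Iteration complexity results of the kind mentioned in the abstract quantify, for classes of such methods, how many iterations are needed to reach a prescribed accuracy under the assumed smoothness and convexity properties.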
Cristi Rusu (LOS)
Learning orthonormal dictionaries as efficient as the Fast Fourier Transform
Abstract: Dictionary learning is a popular technique in signal
processing and machine learning used to build simple or efficient
representations of data. In this talk, we will look at this problem when
the dictionary we construct is constrained to be orthonormal. We discuss
why this property is desirable and how to introduce additional properties
such as numerical efficiency. The goal is to show how to learn
transformations from data that are at least as efficient as the Fast
Fourier Transform (FFT). We will discuss applications and extensions of
this work.
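For readers unfamiliar with the setup, the following is a generic Python sketch of learning an orthonormal dictionary by alternating sparse coding with an orthogonal Procrustes update; this is a standard scheme given only for illustration and is not claimed to be the method of the talk, which additionally targets FFT-like numerical efficiency.

    import numpy as np

    def learn_orthonormal_dictionary(Y, sparsity, n_iter=20):
        # Alternate between sparse coding with a fixed orthonormal dictionary D
        # (keep the largest-magnitude entries of D^T Y in each column) and an
        # orthogonal Procrustes update of D computed from an SVD.
        n = Y.shape[0]
        D = np.eye(n)
        for _ in range(n_iter):
            X = D.T @ Y
            thresh = -np.sort(-np.abs(X), axis=0)[sparsity - 1]  # per-column threshold
            X = X * (np.abs(X) >= thresh)
            U, _, Vt = np.linalg.svd(Y @ X.T)  # Procrustes step: D = U V^T
            D = U @ Vt
        return D

    Y = np.random.randn(8, 100)                     # toy data matrix
    D = learn_orthonormal_dictionary(Y, sparsity=2)
    print(np.allclose(D.T @ D, np.eye(8)))          # True: D stays orthonormal

The Procrustes step keeps D orthonormal at every iteration; obtaining transformations that can also be applied as fast as the FFT requires imposing additional structure on D, which is the goal described in the abstract.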
Florin Stoican (Polytechnic University of Bucharest and LOS)
About the use of B-spline functions in motion planning
Abstract:
B-spline functions are a popular choice for describing trajectories associated with nonlinear
dynamics and for enforcing constraint satisfaction in continuous time. Exploiting their properties
(local support, convexity and positivity) usually leads to sufficient conditions which are
conservative. In this talk I will present two methods which reduce this conservatism: the
first makes use of sum-of-squares polynomials to provide a linear matrix inequality-type
formulation giving necessary and sufficient conditions, and the second makes use of knot
refinement strategies to improve arbitrarily well on the sufficient conditions. In both
cases, the approaches are validated on a motion planning problem where off-line trajectories
with obstacle avoidance guarantees are generated.
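As a small illustration of the objects involved (assuming SciPy is available; the control points, degree and knot vector below are ours), here is a minimal Python sketch of a planar degree-3 B-spline trajectory built from control points. By the convex hull and local support properties mentioned above, the curve stays inside the convex hulls of consecutive control points, which is what the sufficient (but conservative) constraint conditions exploit.

    import numpy as np
    from scipy.interpolate import BSpline

    degree = 3
    # Control points of a planar trajectory; the curve lies in their convex hull.
    ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, -1.0], [3.0, 1.0], [4.0, 0.0]])
    # Clamped uniform knot vector with len(ctrl) + degree + 1 entries.
    knots = np.concatenate(([0.0] * degree,
                            np.linspace(0.0, 1.0, len(ctrl) - degree + 1),
                            [1.0] * degree))
    curve = BSpline(knots, ctrl, degree)
    print(curve(np.linspace(0.0, 1.0, 5)))  # sampled trajectory points

Constraints such as obstacle avoidance imposed on the control points therefore yield sufficient conditions on the whole continuous-time curve; the two methods presented in the talk aim to reduce the conservatism of exactly this kind of condition.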