Events

This week

Monday, 31 March
Tuesday, 1 April
Wednesday, 2 April
Thursday, 3 April
Time · Speaker · Title · Location
14:30 - 16:00 Silvio Barandun
Examiner: Prof. H. Ammari
Doctoral Exam
Foundations of the Skin Effect and Bulk Localisation in Resonator Systems
HG G 43
16:15 - 17:15 Giacomo Cozzi
Università degli Studi di Padova
Geometry Graduate Colloquium
Gradient Flows of Nonlocal Energies
Abstract
The aim of this talk is to present the theory of gradient flows on metric spaces. Given a functional defined on a Hilbert space, its gradient flow is the curve that decreases the functional as fast as possible, namely by following the direction opposite to its gradient. Starting from the pioneering work of De Giorgi, it became possible to give meaning to gradient flows even in spaces where the notion of a gradient is not natural (i.e., spaces that are not Hilbert spaces). An important application is the case of gradient flows defined on the space of probability measures endowed with the Wasserstein distance. Using this theory, we will discuss two examples in which the functionals to be minimized are nonlocal (i.e., long-range interaction) energies.
HG G 19.2
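As background to the abstract above, the two objects it refers to have standard formulations (these are textbook definitions, not taken from the talk itself). On a Hilbert space, the gradient flow of a functional $F$ is the curve

```latex
% Gradient flow of F on a Hilbert space, starting at x_0:
x'(t) = -\nabla F\bigl(x(t)\bigr), \qquad x(0) = x_0 .
```

De Giorgi's minimizing-movement scheme replaces the gradient by a purely metric construction: on a metric space $(X, d)$, one fixes a time step $\tau > 0$ and iterates

```latex
% One step of the minimizing-movement (JKO-type) scheme:
x_{k+1}^{\tau} \in \operatorname*{arg\,min}_{x \in X}
  \left( F(x) + \frac{d\bigl(x, x_k^{\tau}\bigr)^{2}}{2\tau} \right),
```

and lets $\tau \to 0$. Taking $X$ to be the space of probability measures with $d$ the Wasserstein distance $W_2$ recovers the application mentioned in the abstract.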
16:15 - 18:00 Prof. Dr. Annalaura Stingo
Ecole Polytechnique
PDE and Mathematical Physics
Trivial resonances for a system of Klein-Gordon equations and statistical applications
Abstract
In the derivation of the kinetic equation from the cubic NLS, a key feature is the invariance of the Schrödinger equation under the action of U(1), which allows the quasi-resonances of the equation to drive the effective dynamics of the correlations. In this talk, I will give an example of an equation that does not enjoy this type of invariance and show that the exact resonances always take precedence over the quasi-resonances. As a result, the effective dynamics is not of kinetic type but is still nonlinear and non-trivial. I will present the problem, the ideas behind the derivation of the effective dynamics, and some elements of the proof. This is based on soon-to-appear joint work with de Suzzoni (Ecole Polytechnique) and Touati (CNRS and Université de Bordeaux).
Y27 H 46
17:15 - 18:15 Dr. Adrian Riekert
University of Münster
Talks in Financial and Insurance Mathematics
Convergence of gradient methods in the training of neural networks
Abstract
We study the optimization of artificial neural networks (ANNs) with the rectified linear unit activation via gradient flow (GF), the continuous-time analogue of gradient descent. Under suitable regularity assumptions on the target function and the input data of the considered supervised learning problem, we prove that every non-divergent GF trajectory converges with a polynomial rate of convergence to a critical point. The proof relies on a generalized Kurdyka-Łojasiewicz gradient inequality for the risk function. Furthermore, in a simplified shallow ANN training situation, we show that the GF with suitable random initialization converges with high probability to a good critical point with a loss value very close to the global optimum of the loss.
HG G 43
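For orientation, the two ingredients named in the abstract above can be written out in their standard forms (generic textbook statements; the talk's precise assumptions and exponents may differ). The gradient flow of the risk $\mathcal{L}$ over the ANN parameters $\Theta$ is

```latex
% Gradient flow over the network parameters:
\Theta'(t) = -\nabla \mathcal{L}\bigl(\Theta(t)\bigr),
```

and a Łojasiewicz-type gradient inequality near a critical point $\Theta^{*}$ reads

```latex
% Lojasiewicz gradient inequality, with constants C > 0
% and exponent alpha in (0, 1/2]:
\bigl| \mathcal{L}(\Theta) - \mathcal{L}(\Theta^{*}) \bigr|^{1-\alpha}
  \le C \, \bigl\| \nabla \mathcal{L}(\Theta) \bigr\| .
```

Inequalities of this type are the classical route from boundedness of a GF trajectory to its convergence, with the exponent $\alpha$ governing whether the rate is exponential or polynomial.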
Friday, 4 April