Weekly Bulletin
The FIM publishes a newsletter, the FIM Weekly Bulletin, which presents a selection of the mathematics seminars and lectures taking place at ETH Zurich and at the University of Zurich. It is sent by e-mail every Tuesday during the semester and can be accessed on this website at any time.
Subscribe to the Weekly Bulletin
FIM Weekly Bulletin

Monday, 12 May

| Time | Speaker | Title | Location |
|---|---|---|---|
| 15:15 - 16:15 | Filip Broćić (Universität Augsburg, Germany) | Symplectic Geometry Seminar: Riemannian distance and symplectic embeddings in cotangent bundles. Abstract: In the talk, I will introduce a distance-like function on the zero section of the cotangent bundle using symplectic embeddings of standard balls inside an open neighborhood of the zero section. I will provide some examples which illustrate the properties of such a function. The main result that I will present is a relationship between the length structure associated to the introduced quantity and the usual Riemannian length. This relationship requires an upper bound coming from the existence of a J-holomorphic curve, and a lower bound coming from an explicit construction which is related to the strong Viterbo conjecture. Time permitting, I will mention two different flavors of the Gromov width that are contained in joint works with Dylan Cant. | HG G 43 |
| 17:15 - 18:15 | Andrew Stuart (Caltech) | ETH-FDS Stiefel Lectures: Allowing Image And Text Data To Communicate (Attention Is Sometimes Useful). Abstract: A fundamental problem in artificial intelligence is the question of how to simultaneously deploy data from different sources such as audio, image, text and video; such data is known as multimodal. In this talk I will focus on the canonical problem of aligning image and text data, and describe some of the mathematical ideas underlying the challenge of allowing them to communicate. I will describe the encoding of text and image in Euclidean spaces, and describe the contrastive learning methodology used to identify and learn embeddings which align these two modalities. In so doing, I will describe the attention mechanism, a form of nonlinear transform that quantifies correlation in vector-valued sequences (a minimal illustrative sketch follows this table). Attention turns out to be useful beyond this specific context, and I will show how it may be used to design and learn maps between Banach spaces or between metric spaces of probability measures. The former is useful for accelerating MCMC, and the latter for nonlinear filtering. | HG F 30 |
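
The abstract above describes the attention mechanism as a nonlinear transform that quantifies correlation in vector-valued sequences. For readers who want to see this concretely, here is a minimal NumPy sketch of single-head, unmasked scaled dot-product attention; the function name, toy dimensions, and random inputs are purely illustrative and not taken from the talk.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Single-head, unmasked attention over vector-valued sequences.

    queries: (n, d), keys: (m, d), values: (m, d_v).
    Each output row is a convex combination of the value vectors,
    weighted by how strongly the corresponding query correlates with
    each key (the "correlation in vector-valued sequences" mentioned
    in the abstract).
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)                    # pairwise similarity scores
    scores = scores - scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ values

# Toy self-attention: a sequence of 4 tokens, each an 8-dimensional vector.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)
```

The row-wise softmax makes each output a weighted average of the value vectors, with weights determined by query-key similarity.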

Tuesday, 13 May

| Time | Speaker | Title | Location |
|---|---|---|---|
| 15:15 - 16:15 | Dr. Alexis Michelat (EPFL) | Analysis Seminar: Morse Index Stability of Branched Willmore Immersions. Abstract: We show the upper semi-continuity of the sum of the Morse index and nullity of Willmore immersions of bounded energy. In previous work in collaboration with Tristan Rivière, we had established the result under the assumption that the bubbles appearing in the limit are free of branch points, and in this talk we will explain how to generalise this result if branch points arise. This will lead us to study fine properties of two families of fourth-order elliptic operators with regular singularities. | HG G 43 |
| 16:30 - 18:30 | Luca Rubio (ETH) | Zurich Graduate Colloquium: What is... the distribution of p-tower groups? Abstract: The p-tower group of an imaginary quadratic number field K is the Galois group over K of the maximal unramified pro-p extension of K. It is known that the abelianization of the p-tower group is naturally isomorphic to the p-primary part of the narrow ideal class group Cl_K[p]. The Cohen-Lenstra heuristics give a probabilistic explanation for how often Cl_K[p] is isomorphic to a given finite abelian p-group. The goal of this talk is to generalise the Cohen-Lenstra heuristics to the non-abelian case. | KO2 F 150 |

Wednesday, 14 May

| Time | Speaker | Title | Location |
|---|---|---|---|
| 10:15 - 12:00 | Adam Kanigowski (University of Maryland) | | HG G 43 |
| 13:30 - 14:30 | Prof. Dr. Caroline Series (University of Warwick) | Ergodic theory and dynamical systems seminar: Convergence of spherical averages for Fuchsian groups. Abstract: Given a measure preserving action of a group G on a probability space X and a real valued function f on X, we consider the spherical averages S_n(f) of the functions f(g.x), averaged over all elements g of length n in a fixed set of generators. The limiting behaviour of S_n(f) has long been studied. Cesaro convergence has been proved in a wide variety of contexts. Actual convergence (depending on the parity of n) for free groups was proved by Nevo-Stein for f in L^p, p>1. In 2002, Bufetov extended the Nevo-Stein result to a slightly wider class by using a certain self-adjointness property of an associated Markov operator, which in turn depends on the fact that the inverse of a reduced word in a free group is itself reduced. In this talk we explain the same result for a large class of Fuchsian groups with presentations whose relations all have even length. The method relies on a new twist on the Bowen-Series coding for Fuchsian groups: by encoding the set of all shortest words representing a particular group element simultaneously, we obtain a suitable self-adjointness property of an associated Markov operator to which we apply a variant of Bufetov's original proof. This is joint work with Alexander Bufetov and Alexey Klimenko. [Published in Comment. Math. Helv., 2023] | HG F 5 |
| 15:30 - 16:30 | Stefanie Zbinden (Heriot-Watt University) | Geometry Seminar: The contraction space and its applications. Abstract: In the realm of CAT(0) groups, there exists the following powerful dichotomy: either the group has linear divergence, in which case all asymptotic cones are cut-point free, or the group has a Morse geodesic, in which case all asymptotic cones have cut-points and the group is acylindrically hyperbolic. This talk focuses on work in progress with Cornelia Drutu and Davide Spriano, where we show that the above dichotomy holds for a larger class of groups; in particular, it holds for groups acting "nicely" on injective metric spaces and geodesic median spaces. The main tool of the proof is the contraction space construction, which assigns a hyperbolic space to any given geodesic metric space. We will introduce and motivate this construction and outline how it can be used in the proof of the dichotomy. | HG G 43 |
| 16:30 - 17:30 | Prof. Dr. Michael Feischl (TU Wien) | Zurich Colloquium in Applied and Computational Mathematics: Optimal convergence rates in the context of neural networks. Abstract: We present two recent results on the convergence rates of algorithms involving neural networks. First, we propose a hierarchical training algorithm for standard feed-forward neural networks that adaptively extends the network architecture as soon as the optimization reaches a stationary point. By solving small (low-dimensional) optimization problems, the extended network provably escapes any local minimum or stationary point. Under some assumptions on the approximability of the data with stable neural networks, we show that the algorithm achieves an optimal convergence rate s, in the sense that the loss is bounded by $N^{-s}$, where $N$ is the number of network parameters. Second, we show that quadrature with neural network integrands is inherently hard and that no higher-order algorithms can exist, even if the algorithm has access to the weights of the network. | HG G 19.2 |
| 17:15 - 18:00 | Prof. em. Dr. Marc Burger (ETH Zurich, Switzerland) | | HG F 30 |
| 17:15 - 18:30 | Prof. em. Dr. Marc Burger (ETH Zurich, Switzerland) | | HG F 30 |

Thursday, 15 May

| Time | Speaker | Title | Location |
|---|---|---|---|
| 16:00 - 17:00 | Peter Bühlmann (ETH Zürich) | ITS Science Colloquium: Perturbation Data Science (more information: https://eth-its.ethz.ch/activities/its-science-colloquium.html). Abstract: 'Perturbation Data Science' refers to the development of data science methods and algorithms that leverage the effects of perturbations, often unspecific, within data. We will focus on a key aspect of this framework, exploring the links between invariance learning, robustness, and causality. We will highlight how these concepts have been applied to medical and public health problems, and we will emphasize the importance of validating machine learning and AI algorithms on perturbations that occur in real-world problems. | HG E 3 |
| 16:15 - 17:15 | Jingyin Huang (Ohio State University) | | online |
| 16:15 - 18:00 | Dr. Asbjørn Bækgaard Lauritsen (Paris Dauphine) | PDE and Mathematical Physics: Energies of dilute Fermi gases. Abstract: Recently much interest has been given to the study of dilute interacting Bose and Fermi gases. For both the Bose gas and the Fermi gas with spin, the ground state energy densities differ to leading order from those of the free (non-interacting) gases by a term of order $a_s \rho^{2}$, with $a_s$ the $s$-wave scattering length of the repulsive interaction. In contrast, for a spin-polarized Fermi gas, the difference is instead of order $a_p \rho^{8/3}$, with $a_p$ the $p$-wave scattering length. I will discuss some intuition behind these results and in particular how the $p$-wave term is related to the Pauli exclusion principle. Joint work with Robert Seiringer. | Y27 H 46 |
| 17:15 - 18:15 | Prof. Dr. David Criens (University of Freiburg) | Talks in Financial and Insurance Mathematics: On convex expectations, penalty functions and convex semigroups on path space. Abstract: In this talk we discuss three objects from different areas of mathematics: convex expectations from probability theory, penalty functions from stochastic optimal control, and convex semigroups from analysis. In a path space setting, we explain that they are in one-to-one correspondence. Further, we discuss two important consequences of these relations. First, we show that convex expectations on path space are determined by their finite-dimensional distributions. As a consequence, we may relate Hu and Peng's axiomatic definition of $G$-Lévy processes to the control description of Neufeld and Nutz. Second, refining this result further, we show that convex expectations with a certain Markovian structure are determined by convex semigroups on the state space. Time permitting, we discuss an application of this result, namely that certain sublinear expectations on path space are characterized uniquely by their infinitesimal structure on the state space and that they admit a stochastic control representation. This talk is based on joint work with Michael Kupper (University of Konstanz): arXiv:2503.10572. | HG G 43 |

Friday, 16 May

| Time | Speaker | Title | Location |
|---|---|---|---|
| 10:15 - 12:00 | Boris Bukh (Carnegie Mellon University) | | HG G 43 |
| 15:15 - 16:00 | Stephan Mandt (University of California, Irvine) | ZueKoSt (Seminar on Applied Statistics): Scientific Inference with Diffusion Generative Models. Abstract: Diffusion models have transformed generative modeling in various domains such as vision and language. But can they serve as tools for scientific inference? In this talk, I present a perspective that reframes diffusion models as Bayesian solvers for scientific inverse problems involving a noisy measurement process, with applications ranging from climate modeling to astrophysical imaging. Scientific use cases demand more than photorealism: they require calibrated uncertainty, distributional fidelity, efficient conditional sampling, and the ability to model heavy-tailed data. I will highlight four recent advances developed to meet these needs: (1) Variational Control, an improved framework for conditional generation in pretrained diffusion models (ICML '25); (2) Heavy-Tailed Diffusion Models, enabling accurate modeling of sparse and extreme-valued scientific data (ICLR '25); (3) Conjugate Integrators, enabling fast conditional sampling without retraining (NeurIPS '24); and (4) Generative Uncertainty for Diffusion Models, for assessing and exploiting epistemic uncertainties in data generation tasks (UAI '25). Speaker bio: Stephan Mandt is an Associate Professor of Computer Science and Statistics at the University of California, Irvine. His research contributes to the foundations and applications of generative AI, with a focus on generative modeling of 2D, 3D, and sequential data, compression, resource-efficient learning, inference algorithms, and AI-driven scientific discovery. He is a Chan Zuckerberg Investigator and AI Resident and has received the NSF CAREER Award, the UCI ICS Mid-Career Excellence in Research Award, and a Kavli Fellowship. Before UCI, he led the machine learning group at Disney Research and held postdoctoral positions at Princeton and Columbia. He frequently serves as a Senior Area Chair for NeurIPS, ICML, and ICLR and was most recently Program Chair for AISTATS 2024 and General Chair for AISTATS 2025. | HG E 33.5 |
| 16:00 - 17:30 | Dr. Henry Liu (Kavli IPMU, Tokyo) | Algebraic Geometry and Moduli Seminar: Wall-crossing in cohomology, K-theory, and beyond. Abstract: I will explain master space techniques for performing wall-crossing in moduli of sheaves on surfaces and similar spaces. For instance, such techniques can be used to give a topological criterion for the invariance of (equivariant) elliptic genus in variation of GIT. Joyce recently discovered that the resulting wall-crossing formulas, in (co)homology, are controlled by a vertex algebra. This result may be lifted to equivariant K-theory, and, in joint work with Nikolas Kuhn and Felix Thimm, to moduli of sheaves on Calabi-Yau 3-folds and similar spaces. The key technique is a "symmetrized pullback" operation for symmetric obstruction theories. An immediate application is an explicit Donaldson-Thomas/Pandharipande-Thomas correspondence for descendent vertices. | HG G 43 |