DACO Seminar


Please subscribe here if you would like to be notified about these presentations via e-mail. Moreover, you can subscribe to the iCal/ics Calendar.

Spring Semester 2024

* 24 January 2024, 10:30-12:00
Title An Introduction to Nonnegativity and Polynomial Optimization
Speaker, Affiliation Prof. Dr. Timo de Wolff, TU Braunschweig
Location HG G 19.1
Abstract In science and engineering, we regularly face (constrained) polynomial optimization problems (CPOP): the task of minimizing a real, multivariate polynomial under polynomial constraints. Solving these problems is essentially equivalent to certifying nonnegativity of real polynomials – a key problem in real algebraic geometry since the 19th century. Since this problem is notoriously hard to solve (e.g., various NP-complete problems admit a CPOP formulation), one is interested in certificates that imply nonnegativity and are easier to check than nonnegativity itself. In particular, a polynomial is nonnegative if it is a sum of squares (SOS) of other polynomials. Being an SOS can be detected effectively via semidefinite programming (SDP) in practice. In 2014, Iliman and I introduced a new certificate of nonnegativity based on sums of nonnegative circuit polynomials (SONC), which I have since developed further, in both theory and practice, jointly with different coauthors. In particular, these certificates are independent of sums of squares and can be computed via relative entropy programs. In this talk, I will give an introduction to polynomial optimization, nonnegativity, SOS, and SONC, and, if time permits, give examples of applications of SONCs.
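The SOS-to-SDP connection mentioned in the abstract can be shown in miniature. The sketch below is an illustrative toy, not the speaker's software: it certifies nonnegativity of p(x) = x^4 - 2x^2 + 1 by exhibiting a positive semidefinite Gram matrix Q with p = vᵀQv for v = (1, x, x²). Here Q is hand-picked; an SOS solver would search for such a Q with an SDP solver.

```python
import itertools

# Hand-picked Gram matrix for p(x) = x^4 - 2x^2 + 1 in the basis v = (1, x, x^2).
Q = [[ 1.0, 0.0, -1.0],
     [ 0.0, 0.0,  0.0],
     [-1.0, 0.0,  1.0]]

def poly_from_gram(Q):
    """Expand v^T Q v, v = (1, x, x^2), into coefficients of 1, x, ..., x^4."""
    coeffs = [0.0] * 5
    for i in range(3):
        for j in range(3):
            coeffs[i + j] += Q[i][j]
    return coeffs

def is_psd(Q, tol=1e-9):
    """PSD test for a small symmetric matrix: all principal minors >= 0."""
    def det(M):
        if len(M) == 1:
            return M[0][0]
        return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
                   for j in range(len(M)))
    n = len(Q)
    for k in range(1, n + 1):
        for S in itertools.combinations(range(n), k):
            if det([[Q[i][j] for j in S] for i in S]) < -tol:
                return False
    return True

print(poly_from_gram(Q))  # [1.0, 0.0, -2.0, 0.0, 1.0], i.e. x^4 - 2x^2 + 1
print(is_psd(Q))          # True -> p is SOS (indeed p = (x^2 - 1)^2), hence nonnegative
```

The exhaustive principal-minor test is only sensible for tiny matrices; it stands in for the eigenvalue or Cholesky check an SDP solver would use.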
20 February 2024, 14:15-15:15
Title Computational Limits of Random Optimization Problems
Speaker, Affiliation Dr. Eren C. Kizildag, Columbia, US
Location HG G 19.1 (Zoom talk)
Abstract Optimization problems with random objective functions are central in computer science, probability, and modern data science. Despite their ubiquity, finding efficient algorithms for solving these problems remains a major challenge. Interestingly, many random optimization problems share a common feature, dubbed the statistical-computational gap: while the optimal value can be pinpointed non-constructively, all known polynomial-time algorithms find strictly sub-optimal solutions. That is, an optimal solution can only be found through brute-force search, which is computationally expensive. In this talk, I will discuss an emerging theoretical framework for understanding the computational limits of random optimization problems, based on the Overlap Gap Property (OGP). This is an intricate geometrical property that yields sharp algorithmic lower bounds against the best known polynomial-time algorithms for a wide range of random optimization problems. I will focus on two models to demonstrate the power of the OGP framework: (a) the symmetric binary perceptron, a simple neural network classifying/storing random patterns and a random constraint satisfaction problem, widely studied in probability, statistics, and computer science, and (b) the random number partitioning problem as well as its planted counterpart, which is closely related to the design of randomized controlled trials. In addition to yielding sharp algorithmic lower bounds, our techniques also give rise to new toolkits for the study of statistical-computational gaps in other models, including the online setting.
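The brute-force search the abstract alludes to is easy to make concrete for the number partitioning problem: the optimal discrepancy can be found by exhaustively scanning all 2^n sign assignments, at exponential cost in n. A small illustrative toy (not code from the talk):

```python
import itertools
import random

def best_partition(items):
    """Minimize |sum of +/- items| by brute force over all 2^n sign choices."""
    best_disc, best_signs = None, None
    for signs in itertools.product((1, -1), repeat=len(items)):
        disc = abs(sum(s * x for s, x in zip(signs, items)))
        if best_disc is None or disc < best_disc:
            best_disc, best_signs = disc, signs
    return best_disc, best_signs

random.seed(0)
items = [random.randint(1, 100) for _ in range(12)]
disc, signs = best_partition(items)
print(disc)  # optimal discrepancy, found only after scanning all 2^12 assignments
```

For i.i.d. uniform instances the optimum is exponentially small in n, yet known polynomial-time algorithms only reach much larger discrepancies — the gap the OGP framework addresses.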
* 22 February 2024, 11:15-12:15
Title Second largest eigenvalue of the sum of a deterministic matrix and a random permutation
Speaker, Affiliation Sarah Timhadjelt, Aix-Marseille Université, France
Location HG G 19.1
Abstract We consider a random bistochastic matrix of size N of the form (1-r)M + rQ, where 0 < r < 1, M is a deterministic bistochastic matrix, and Q is a random permutation matrix.
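The random matrix in the abstract is straightforward to construct. The sketch below is illustrative only: it takes M to be the uniform bistochastic matrix J/N (an assumption, not from the abstract), mixes it with a uniformly random permutation matrix Q, and checks that the result is indeed bistochastic.

```python
import random

def random_bistochastic(N, r, seed=0):
    """Build (1-r)*J/N + r*Q with J/N uniform bistochastic, Q a random permutation matrix."""
    rng = random.Random(seed)
    perm = list(range(N))
    rng.shuffle(perm)  # Q has a 1 in position (i, perm[i])
    return [[(1 - r) / N + (r if perm[i] == j else 0.0) for j in range(N)]
            for i in range(N)]

A = random_bistochastic(5, 0.3)
row_sums = [sum(row) for row in A]
col_sums = [sum(A[i][j] for i in range(5)) for j in range(5)]
print(row_sums, col_sums)  # every row and column sums to 1; all entries nonnegative
```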
* 8 March 2024, 11:15-12:15
Title Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space
Speaker, Affiliation Aram-Alexandre Pooladian, NYU, US
Location HG G 43
Abstract We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution $\pi$ over $\mathbb{R}^d$ by a product measure $\pi^\star$. When $\pi$ is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that $\pi^\star$ is close to the minimizer $\pi^\star_\diamond$ of the KL divergence over a \emph{polyhedral} set $\mathcal{P}_\diamond$, and (2) an algorithm for minimizing $\text{KL}(\cdot\|\pi)$ over $\mathcal{P}_\diamond$ with accelerated complexity $O(\sqrt \kappa \log(\kappa d/\varepsilon^2))$, where $\kappa$ is the condition number of $\pi$. Joint work with Yiheng Jiang and Sinho Chewi.
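For a Gaussian target, the mean-field problem in the abstract has a classical closed form: the KL-optimal product Gaussian matches the target's means and has variances 1/Λ_ii, where Λ = Σ^{-1} is the target's precision matrix. This is a textbook fact about mean-field variational inference, not a result of the talk; a minimal 2D check:

```python
# Target: pi = N(mu, Sigma) in 2D. The mean-field minimizer of KL(q || pi)
# over product measures is q_1 x q_2 with q_i = N(mu_i, 1 / Lambda_ii),
# where Lambda = Sigma^{-1}. (Classical closed form.)
mu = (0.0, 0.0)
Sigma = [[2.0, 1.0],
         [1.0, 2.0]]

det = Sigma[0][0] * Sigma[1][1] - Sigma[0][1] * Sigma[1][0]  # = 3
Lambda = [[ Sigma[1][1] / det, -Sigma[0][1] / det],
          [-Sigma[1][0] / det,  Sigma[0][0] / det]]

mf_variances = [1.0 / Lambda[0][0], 1.0 / Lambda[1][1]]
print(mf_variances)  # [1.5, 1.5]
```

Note that the mean-field variances (1.5) are strictly smaller than the marginal variances of the target (2.0): mean-field approximations famously underestimate spread under positive correlation.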
* 19 March 2024, 10:15-11:15
Title Feature learning, lower-homogeneity, and normalization layers
Speaker, Affiliation Prof. Dr. Matus Telgarsky, NYU, US
Location HG G 43
Abstract The first half of this talk will describe the feature learning problem in deep learning optimization, its statistical consequences, and an approach to proving general theorems with a heavy reliance on normalization layers, which are common to all modern architectures but typically treated as an analytic nuisance. Theorems will cover two settings: concrete results for shallow networks, and abstract template theorems for general architectures. The shallow network results allow for globally maximal margins at the cost of large width and no further assumptions, while the general architecture theorems give convergence rates to KKT points for a new general class of architectures satisfying “partial lower-homogeneity”. The second half will be technical, demonstrating two core proof techniques. The first ingredient, essential to the shallow analysis, is a new mirror descent lemma, strengthening a beautiful idea discovered by Chizat and Bach. The second ingredient is the concept of “partial lower-homogeneity” and its consequences. Joint work with Danny Son; not currently on arXiv, but “coming soon”.
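One reason normalization layers interact with homogeneity arguments can be glimpsed from a basic property: layer normalization (without affine parameters) is invariant to positive rescaling of its input, i.e. it is 0-homogeneous. A minimal sketch, illustrative rather than the speaker's setup:

```python
import math

def layer_norm(x, eps=1e-12):
    """Center and scale a vector to zero mean and unit variance (no affine params)."""
    m = sum(x) / len(x)
    var = sum((xi - m) ** 2 for xi in x) / len(x)
    return [(xi - m) / math.sqrt(var + eps) for xi in x]

x = [1.0, 2.0, 4.0]
y = layer_norm(x)
z = layer_norm([10.0 * xi for xi in x])  # input rescaled by c = 10 > 0
print(all(abs(a - b) < 1e-6 for a, b in zip(y, z)))  # True: output is unchanged
```

Because of this invariance, scaling the weights feeding into a normalization layer leaves the network output unchanged, which is exactly the kind of degenerate direction that "partial lower-homogeneity" analyses must accommodate.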
19 March 2024, 14:15-15:15
Title The sensor network localization problem has benign landscape under mild rank relaxation
Speaker, Affiliation Christopher Criscitiello, EPFL, CH
Location HG G 19.1
Abstract We consider the sensor network localization problem (also called metric multidimensional scaling): we observe some pairwise distances between n ground-truth points in R^d, and our goal is to recover this cloud of ground-truth points (up to translation and rotation). The corresponding optimization problem is nonconvex, and we show that it can have spurious local minima. However, inspired by numerical experiments, we argue that if one relaxes the problem by optimizing over clouds of n points in dimension k greater than d, then all second-order critical points of the problem are global minima. Specifically, we show this for two settings: (1) for arbitrary ground-truth points, when all pairwise distances are known, and k = O(sqrt{n d}), and (2) for isotropic random ground-truth points, when most (but not necessarily all) pairwise distances are known, and k = O(d log(n)). To the best of our knowledge, these are the first landscape results for this nonconvex version of sensor network localization.
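The nonconvex objective behind the abstract can be written down directly: given observed squared distances d_ij, minimize the sum over observed pairs of (||x_i - x_j||² - d_ij)². The toy below (illustrative; one common formulation of the problem, not necessarily the exact loss from the talk) evaluates this loss and confirms it vanishes at the ground truth and at any translate of it:

```python
def sq_dist(p, q):
    """Squared Euclidean distance between two points."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def snl_loss(points, observed):
    """Sensor network localization loss; observed maps (i, j) -> squared distance."""
    return sum((sq_dist(points[i], points[j]) - d) ** 2
               for (i, j), d in observed.items())

truth = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (3.0, 1.0)]
observed = {(i, j): sq_dist(truth[i], truth[j])
            for i in range(4) for j in range(i + 1, 4)}

print(snl_loss(truth, observed))    # 0.0 at the ground truth
shifted = [(x + 5.0, y) for x, y in truth]
print(snl_loss(shifted, observed))  # 0.0 as well: translations preserve distances
```

The rank relaxation in the talk corresponds to letting the points live in R^k with k > d, which enlarges the search space enough to remove spurious second-order critical points.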
9 April 2024, 14:15-15:15
Title Private graphon estimation via sum-of-squares
Speaker, Affiliation Jingqiu Ding, ETH Zürich, CH
Location HG G 19.1
Abstract We develop the first pure node-differentially-private algorithms for learning stochastic block models and for graphon estimation with polynomial running time for any constant number of blocks. The statistical utility guarantees match those of the previous best information-theoretic (exponential-time) node-private mechanisms for these problems. The algorithm is based on an exponential mechanism for a score function defined in terms of a sum-of-squares relaxation whose level depends on the number of blocks. The key ingredients of our results are (1) a characterization of the distance between the block graphons in terms of a quadratic optimization over the polytope of doubly stochastic matrices, (2) a general sum-of-squares convergence result for polynomial optimization over arbitrary polytopes, and (3) a general approach to perform Lipschitz extensions of score functions as part of the sum-of-squares algorithmic paradigm.
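The exponential mechanism the abstract builds on is a standard primitive of differential privacy: output candidate c with probability proportional to exp(ε·score(c)/(2Δ)), where Δ is the sensitivity of the score. A generic sketch of that primitive (illustrative; the talk's actual score function comes from a sum-of-squares relaxation and is not reproduced here):

```python
import math
import random

def exponential_mechanism(candidates, score, eps, sensitivity, rng):
    """Sample c with probability proportional to exp(eps * score(c) / (2 * sensitivity))."""
    scores = [score(c) for c in candidates]
    m = max(scores)  # shift by the max for numerical stability
    weights = [math.exp(eps * (s - m) / (2 * sensitivity)) for s in scores]
    u = rng.random() * sum(weights)
    acc = 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if u <= acc:
            return c
    return candidates[-1]

rng = random.Random(0)
candidates = ["a", "b", "c"]
picked = exponential_mechanism(candidates, {"a": 1.0, "b": 5.0, "c": 2.0}.get,
                               eps=1.0, sensitivity=1.0, rng=rng)
print(picked)  # higher-scoring candidates are exponentially more likely
```

The privacy guarantee comes from the sensitivity bound on the score; this is why the abstract's ingredient (3), Lipschitz extension of the score function, matters.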
* 10 April 2024, 10:15-12:00
Title Proof of the Majorizing Measure Theorem: Part I
Speaker, Affiliation Prof. Dr. Shahar Mendelson, Australian National University
Location HG G 43
Abstract As part of a mini-course hosted by the FIM, the speaker will give a complete proof of the Majorizing Measures Theorem (roughly 4h of lectures). On the occasion of the awarding of the Abel Prize to Michel Talagrand, the organisers suggested, and the speaker kindly agreed, to isolate this part of the course as a 4h event, where the proof (and some of the motivation) will be made accessible also to an audience not attending the course. It will take place on 10.04.24, split into two two-hour sessions, one in the morning (10:15) and one in the afternoon (14:15), both in HG G 43. All are welcome and we hope to see many of you there!
* 10 April 2024, 14:15-16:00
Title Proof of the Majorizing Measure Theorem: Part II
Speaker, Affiliation Prof. Dr. Shahar Mendelson, Australian National University
Location HG G 43
Abstract As part of a mini-course hosted by the FIM, the speaker will give a complete proof of the Majorizing Measures Theorem (roughly 4h of lectures). On the occasion of the awarding of the Abel Prize to Michel Talagrand, the organisers suggested, and the speaker kindly agreed, to isolate this part of the course as a 4h event, where the proof (and some of the motivation) will be made accessible also to an audience not attending the course. It will take place on 10.04.24, split into two two-hour sessions, one in the morning (10:15) and one in the afternoon (14:15), both in HG G 43. All are welcome and we hope to see many of you there!
* 29 May 2024, 09:15-18:00
Title Random Matrix Day
Speaker, Affiliation Various Speakers
Location HG E 41
Abstract More information to be announced closer to the date.

Notes: events marked with an asterisk (*) take place at a time and/or location different from the usual one.
