DACO Seminar


Please subscribe here if you would like to be notified about these events by e-mail. You can also subscribe to the iCal/ics calendar.

Spring Semester 2024

Title An Introduction to Nonnegativity and Polynomial Optimization
Speaker, Affiliation Prof. Dr. Timo de Wolff, TU Braunschweig
Date, Time * 24 January 2024, 10:30-12:00
Location HG G 19.1
Abstract In science and engineering, we regularly face (constrained) polynomial optimization problems (CPOP): the task of minimizing a real, multivariate polynomial under polynomial constraints. Solving these problems is essentially equivalent to certifying nonnegativity of real polynomials – a key problem in real algebraic geometry since the 19th century. Since this problem is notoriously hard (e.g., various NP-complete problems admit a CPOP formulation), one is interested in certificates that imply nonnegativity and are easier to check than nonnegativity itself. In particular, a polynomial is nonnegative if it is a sum of squares (SOS) of other polynomials. Being an SOS can be detected effectively via semidefinite programming (SDP) in practice. In 2014, Iliman and I introduced a new certificate of nonnegativity based on sums of nonnegative circuit polynomials (SONC), which I have since developed further in theory and practice jointly with various coauthors. In particular, these certificates are independent of sums of squares and can be computed via relative entropy programs. In this talk, I will give an introduction to polynomial optimization, nonnegativity, SOS, and SONC, and, if time permits, give examples of applications of SONCs.
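The SOS certificate mentioned in the abstract can be illustrated with a minimal numpy sketch (my toy example, not the speaker's software): for p(x) = x^4 - 2x^2 + 1 = (x^2 - 1)^2 we exhibit a Gram matrix Q over the monomial vector m(x) = (1, x, x^2) with p = m^T Q m and Q positive semidefinite, which certifies p >= 0. Real SOS solvers find such a Q via SDP; here it is written down by hand and only verified numerically.

```python
import numpy as np

# SOS certificate sketch: p(x) = x^4 - 2x^2 + 1 equals m(x)^T Q m(x)
# with m(x) = (1, x, x^2) and Q positive semidefinite, hence p >= 0.
# (Hand-picked toy Gram matrix; SOS solvers would find Q via SDP.)
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

def p(x):
    return x**4 - 2 * x**2 + 1

def gram_eval(x):
    # evaluate m(x)^T Q m(x) for the monomial vector m(x) = (1, x, x^2)
    m = np.array([1.0, x, x**2])
    return m @ Q @ m

# Q is PSD: all eigenvalues nonnegative (up to roundoff).
eigs = np.linalg.eigvalsh(Q)
psd = bool(eigs.min() >= -1e-12)

# The Gram form reproduces p on sample points.
xs = np.linspace(-2, 2, 9)
match = bool(np.allclose([gram_eval(x) for x in xs], p(xs)))
```

Any PSD Gram matrix factors as Q = V^T V, and the rows of V give an explicit SOS decomposition; here the factorization recovers p = (x^2 - 1)^2.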
Title Computational Limits of Random Optimization Problems
Speaker, Affiliation Dr. Eren C. Kizildag, Columbia, US
Date, Time 20 February 2024, 14:15-15:15
Location HG G 19.1
Zoom talk
Abstract Optimization problems with random objective functions are central in computer science, probability, and modern data science. Despite their ubiquity, finding efficient algorithms for solving these problems remains a major challenge. Interestingly, many random optimization problems share a common feature, dubbed the statistical-computational gap: while the optimal value can be pinpointed non-constructively, all known polynomial-time algorithms find strictly sub-optimal solutions. That is, an optimal solution can only be found through brute-force search, which is computationally expensive. In this talk, I will discuss an emerging theoretical framework for understanding the computational limits of random optimization problems, based on the Overlap Gap Property (OGP). This is an intricate geometrical property that yields sharp algorithmic lower bounds against the best known polynomial-time algorithms for a wide range of random optimization problems. I will focus on two models to demonstrate the power of the OGP framework: (a) the symmetric binary perceptron, a simple neural network classifying/storing random patterns and a random constraint satisfaction problem, widely studied in probability, statistics, and computer science, and (b) the random number partitioning problem as well as its planted counterpart, which is closely related to the design of randomized controlled trials. In addition to yielding sharp algorithmic lower bounds, our techniques also give rise to new toolkits for the study of statistical-computational gaps in other models, including the online setting.
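The brute-force baseline for model (b), random number partitioning, is easy to make concrete (a sketch of mine with arbitrary instance size and seed): enumerate all 2^n sign assignments to find the optimal discrepancy, and compare with a simple greedy heuristic, which is the kind of polynomial-time algorithm that typically falls short of the optimum.

```python
import numpy as np
from itertools import product

# Brute-force number partitioning: given n random numbers a_1..a_n, find
# the sign assignment sigma in {-1,+1}^n minimizing |sum_i sigma_i a_i|.
# Exhaustive search over 2^n assignments is the computationally expensive
# route to the true optimum.
rng = np.random.default_rng(0)
n = 12
a = rng.standard_normal(n)

best = min(abs(np.dot(s, a)) for s in product([-1.0, 1.0], repeat=n))

# A simple greedy heuristic (largest magnitude first, sign opposing the
# running sum) for comparison: polynomial time, but generally suboptimal.
order = np.argsort(-np.abs(a))
running = 0.0
for i in order:
    running += a[i] if running <= 0 else -a[i]
greedy = abs(running)
```

By construction the greedy value can never beat the exhaustive optimum, and on typical instances it is far larger, which is the gap the OGP framework aims to explain.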
Title Second largest eigenvalue of the sum of a deterministic matrix and a random permutation
Speaker, Affiliation Sarah Timhadjelt, Aix-Marseille Université, France
Date, Time * 22 February 2024, 11:15-12:15
Location HG G 19.1
Abstract We consider a random bistochastic matrix of size N of the form (1-r)M + rQ, where 0 < r < 1.
Title Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space
Speaker, Affiliation Aram-Alexandre Pooladian, NYU, US
Date, Time * 8 March 2024, 11:15-12:15
Location HG G 43
Abstract We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution $\pi$ over $\mathbb{R}^d$ by a product measure $\pi^\star$. When $\pi$ is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that $\pi^\star$ is close to the minimizer $\pi^\star_\diamond$ of the KL divergence over a \emph{polyhedral} set $\mathcal{P}_\diamond$, and (2) an algorithm for minimizing $\text{KL}(\cdot\|\pi)$ over $\mathcal{P}_\diamond$ with accelerated complexity $O(\sqrt \kappa \log(\kappa d/\varepsilon^2))$, where $\kappa$ is the condition number of $\pi$. Joint work with Yiheng Jiang and Sinho Chewi.
Title Feature learning, lower-homogeneity, and normalization layers
Speaker, Affiliation Prof. Dr. Matus Telgarsky, NYU, US
Date, Time * 19 March 2024, 10:15-11:15
Location HG G 43
Abstract The first half of this talk will describe the feature learning problem in deep learning optimization, its statistical consequences, and an approach to proving general theorems with a heavy reliance on normalization layers, which are common to all modern architectures but typically treated as an analytic nuisance. Theorems will cover two settings: concrete results for shallow networks, and abstract template theorems for general architectures. The shallow network results allow for globally maximal margins at the cost of large width and no further assumptions, while the general architecture theorems give convergence rates to KKT points for a new general class of architectures satisfying “partial lower-homogeneity”. The second half will be technical, demonstrating two core proof techniques. The first ingredient, essential to the shallow analysis, is a new mirror descent lemma, strengthening a beautiful idea discovered by Chizat and Bach. The second ingredient is the concept of “partial lower-homogeneity” and its consequences. Joint work with Danny Son; not currently on arXiv, but “coming soon”.
Title The sensor network localization problem has benign landscape under mild rank relaxation
Speaker, Affiliation Christopher Criscitiello, EPFL, CH
Date, Time 19 March 2024, 14:15-15:15
Location HG G 19.1
Abstract We consider the sensor network localization problem (also called metric multidimensional scaling): we observe some pairwise distances between n ground-truth points in R^d, and our goal is to recover this cloud of ground-truth points (up to translation and rotation). The corresponding optimization problem is nonconvex, and we show that it can have spurious local minima. However, inspired by numerical experiments, we argue that if one relaxes the problem by optimizing over clouds of n points in dimension k greater than d, then all second-order critical points of the problem are global minima. Specifically, we show this for two settings: (1) for arbitrary ground-truth points, when all pairwise distances are known, and k = O(sqrt{n d}), and (2) for isotropic random ground-truth points, when most (but not necessarily all) pairwise distances are known, and k = O(d log(n)). To the best of our knowledge, these are the first landscape results for this nonconvex version of sensor network localization.
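The nonconvex objective in question is easy to write down. This sketch (my toy instance: n = 10 points in d = 2, all squared pairwise distances observed, lifted dimension k = 5) runs backtracking gradient descent on the stress function f(X) = (1/2) * sum_{i,j} (||x_i - x_j||^2 - D_ij)^2 and only checks that the loss decreases; it does not prove anything about the landscape.

```python
import numpy as np

# Gradient descent on the squared-distance stress in a lifted dimension
# k > d (toy instance; parameters are arbitrary choices of mine).
rng = np.random.default_rng(3)
n, d, k = 10, 2, 5                     # lifted rank k > d
truth = rng.standard_normal((n, d))
D = ((truth[:, None] - truth[None, :]) ** 2).sum(-1)   # observed sq. distances

def stress_and_grad(X):
    diff = X[:, None] - X[None, :]                     # (n, n, k)
    R = (diff ** 2).sum(-1) - D                        # residual matrix
    f = 0.5 * (R ** 2).sum()
    grad = 4 * (R[:, :, None] * diff).sum(axis=1)      # df/dx_i
    return f, grad

X = rng.standard_normal((n, k))        # random init in the lifted space
f, g = stress_and_grad(X)
f0 = f
for _ in range(300):                   # descent with backtracking line search
    step = 1e-2
    while step > 1e-12:
        Xn = X - step * g
        fn, gn = stress_and_grad(Xn)
        if fn < f:                     # accept only strict decrease
            X, f, g = Xn, fn, gn
            break
        step *= 0.5
```

The talk's result says that for k large enough, such first-order runs cannot get stuck at spurious second-order critical points; in the un-lifted case k = d they can.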
Title Private graphon estimation via sum-of-squares
Speaker, Affiliation Jingqiu Ding, ETH Zürich, CH
Date, Time 9 April 2024, 14:15-15:15
Location HG G 19.1
Abstract We develop the first pure node-differentially-private algorithms for learning stochastic block models and for graphon estimation with polynomial running time for any constant number of blocks. The statistical utility guarantees match those of the previous best information-theoretic (exponential-time) node-private mechanisms for these problems. The algorithm is based on an exponential mechanism for a score function defined in terms of a sum-of-squares relaxation whose level depends on the number of blocks. The key ingredients of our results are (1) a characterization of the distance between the block graphons in terms of a quadratic optimization over the polytope of doubly stochastic matrices, (2) a general sum-of-squares convergence result for polynomial optimization over arbitrary polytopes, and (3) a general approach to perform Lipschitz extensions of score functions as part of the sum-of-squares algorithmic paradigm.
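Ingredient (1) optimizes over the polytope of doubly stochastic matrices (the Birkhoff polytope). A classical way to produce a point of that polytope, shown here purely as an illustration and not as the paper's mechanism, is Sinkhorn iteration: alternately normalize the rows and columns of a strictly positive matrix until both sets of sums are 1.

```python
import numpy as np

# Sinkhorn iteration: alternating row/column normalization of a strictly
# positive matrix converges to a doubly stochastic matrix, i.e. a point
# of the Birkhoff polytope appearing in ingredient (1) above.
rng = np.random.default_rng(4)
n = 6
A = rng.uniform(0.5, 2.0, size=(n, n))   # strictly positive start

for _ in range(200):
    A /= A.sum(axis=1, keepdims=True)    # make each row sum to 1
    A /= A.sum(axis=0, keepdims=True)    # make each column sum to 1

row_ok = bool(np.allclose(A.sum(axis=1), 1.0, atol=1e-8))
col_ok = bool(np.allclose(A.sum(axis=0), 1.0, atol=1e-8))
```

Each normalization is a projection-like step, and for strictly positive starting matrices the alternation converges geometrically (Sinkhorn's theorem), so a few hundred sweeps reach machine precision at this scale.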
Title Proof of the Majorizing Measure Theorem: Part I
Speaker, Affiliation Prof. Dr. Shahar Mendelson, Australian National University
Date, Time * 10 April 2024, 10:15-12:00
Location HG G 43
Abstract As part of a mini-course hosted by the FIM, the speaker will give a complete proof of the Majorizing Measures Theorem (roughly 4h of lecture). On the occasion of the awarding of the Abel Prize to Michel Talagrand, the organisers suggested, and the speaker kindly agreed, to isolate this part of the course as a 4h event, where the proof (and some of the motivation) will be made accessible also to an audience not attending the course. This will take place on 10.04.24, split into two two-hour sessions, one in the morning (10:15) and one in the afternoon (14:15), both in HG G 43. All are welcome and we hope to see many of you there!
Title Proof of the Majorizing Measure Theorem: Part II
Speaker, Affiliation Prof. Dr. Shahar Mendelson, Australian National University
Date, Time * 10 April 2024, 14:15-16:00
Location HG G 43
Abstract As part of a mini-course hosted by the FIM, the speaker will give a complete proof of the Majorizing Measures Theorem (roughly 4h of lecture). On the occasion of the awarding of the Abel Prize to Michel Talagrand, the organisers suggested, and the speaker kindly agreed, to isolate this part of the course as a 4h event, where the proof (and some of the motivation) will be made accessible also to an audience not attending the course. This will take place on 10.04.24, split into two two-hour sessions, one in the morning (10:15) and one in the afternoon (14:15), both in HG G 43. All are welcome and we hope to see many of you there!
Title Entrywise positivity preservers: connecting covariance estimation, analysis, and combinatorics
Speaker, Affiliation Prof. Dr. Apoorva Khare, Indian Institute of Science, IN
Date, Time 14 May 2024, 14:15-15:15
Location HG G 19.1
Abstract Which functions preserve positive semidefiniteness (psd) when applied entrywise to psd matrices? This question has a long history beginning with Schur, Schoenberg, and Rudin, and has also recently received renewed attention due to applications in high-dimensional statistics. I will explain some of these developments, and connections to other areas. These include analysis results on Fourier sequence preservers (by Rudin) and moment sequence preservers (in recent work). Next come contributions of Loewner, Karlin, and their students: FitzGerald, Horn, Micchelli, and Pinkus, on entrywise maps in all dimensions and in a fixed dimension. I will end with some recent results that describe hitherto undiscovered connections to combinatorics, via a novel graph invariant. (Partly based on joint works with Alexander Belton, Dominique Guillot, Mihai Putinar, Bala Rajaratnam, and Terence Tao.)
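The classical starting point of this story, Schur's product theorem, is easy to check numerically: the entrywise (Hadamard) product of two psd matrices is psd, which immediately shows that entrywise integer powers, and any power series with nonnegative coefficients, preserve positive semidefiniteness. A quick numerical illustration (my example, random psd matrices):

```python
import numpy as np

# Schur product theorem, numerically: if A and B are psd, so is the
# entrywise (Hadamard) product A * B.  Hence x -> x^k and power series
# with nonnegative coefficients preserve psd when applied entrywise.
rng = np.random.default_rng(5)
n = 5
X = rng.standard_normal((n, n)); A = X @ X.T   # random psd matrix
Y = rng.standard_normal((n, n)); B = Y @ Y.T   # random psd matrix

had = A * B                                    # entrywise product
psd = bool(np.linalg.eigvalsh(had).min() >= -1e-10)

# Special case: the entrywise square A * A of a psd matrix is psd.
min_eig_sq = float(np.linalg.eigvalsh(A * A).min())
```

The subtle part of the talk's subject is the converse direction in fixed dimension: which other functions, e.g. fractional powers, still preserve psd entrywise, where the answer depends delicately on the matrix size.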
Title Computational hardness of testing for graph lifts and certifying properties of random regular graphs
Speaker, Affiliation Dr. Tim Kunisky, Yale University, USA
Date, Time * 16 May 2024, 10:15-11:15
Location HG G 19.2
Abstract Certification tasks ask an algorithm to output a quantity that is guaranteed to be a bound on an optimization problem. I will present a new form of evidence that certification tasks over random regular graphs (say, certifying bounds on the chromatic number, maximum cut, or expansion of a graph) are computationally hard. Rather than proving lower bounds against concrete certification algorithms like convex relaxations, this argument will draw a connection with computational statistics, showing that a good certification algorithm could solve a hypothesis testing task that we believe to be difficult. The key innovation will be a means of "quietly planting" unusual structures in regular graphs in a way that is hard to detect by taking random lifts of a suitable base graph. For regular graphs of fixed degree, I will give evidence that random lifts are hard to distinguish from uniformly random graphs provided that the base graph is Ramanujan. This gives a surprising connection between certification and the extremal properties of Ramanujan graphs, and allows for simple "proofs by example" of the hardness of certification by merely exhibiting specific finite Ramanujan graphs with various properties. I will show how this approach gives the first evidence for the hardness of certification of many properties of random regular graphs and will present numerous intriguing open problems that it suggests. Based on recent joint work with Xifan Yu.
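The Ramanujan condition appearing in the abstract is concrete: a connected d-regular graph is Ramanujan when every adjacency eigenvalue other than d has modulus at most 2*sqrt(d-1). A quick check on a standard example of my choosing, the 3-regular Petersen graph (spectrum 3, 1, -2):

```python
import numpy as np

# Ramanujan check for the 3-regular Petersen graph: every adjacency
# eigenvalue other than d = 3 must have modulus <= 2*sqrt(d-1).
# Vertices 0..4 form the outer 5-cycle, 5..9 the inner pentagram.
d = 3
A = np.zeros((10, 10))
for i in range(5):
    for u, v in [(i, (i + 1) % 5),          # outer cycle edge
                 (i, i + 5),                # spoke
                 (i + 5, (i + 2) % 5 + 5)]: # inner pentagram edge
        A[u, v] = A[v, u] = 1.0

eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
top = eigs[0]                               # = d for a connected d-regular graph
nontrivial = max(abs(eigs[1]), abs(eigs[-1]))
is_ramanujan = bool(nontrivial <= 2 * np.sqrt(d - 1) + 1e-9)
```

Here the nontrivial eigenvalues are 1 and -2, well inside the Ramanujan bound 2*sqrt(2) ≈ 2.83; the talk's "proofs by example" exhibit such finite Ramanujan graphs with additional structure.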
Title The Subspace Flatness Conjecture and Faster Integer Programming
Speaker, Affiliation Prof. Dr. Thomas Rothvoss, University of Washington, US
Date, Time * 28 May 2024, 14:15-15:15
Location HG G 19.2
Abstract In a seminal paper, Kannan and Lovász (1988) considered a quantity $\mu_{KL}(\Lambda,K)$ which denotes the best volume-based lower bound on the \emph{covering radius} $\mu(\Lambda,K)$ of a convex body $K$ with respect to a lattice $\Lambda$. Kannan and Lovász proved that $\mu(\Lambda,K) \leq n \cdot \mu_{KL}(\Lambda,K)$ and the Subspace Flatness Conjecture by Dadush (2012) claims a $O(\log n)$ factor suffices, which would match the lower bound from the work of Kannan and Lovász. We settle this conjecture up to a constant in the exponent by proving that $\mu(\Lambda,K) \leq O(\log^{3}(n)) \cdot \mu_{KL} (\Lambda,K)$. Our proof is based on the Reverse Minkowski Theorem due to Regev and Stephens-Davidowitz (2017). Following the work of Dadush (2012, 2019), we obtain a $(\log n)^{O(n)}$-time randomized algorithm to solve integer programs in $n$ variables. Another implication of our main result is a near-optimal \emph{flatness constant} of $O(n \log^{3}(n))$. This is joint work with Victor Reis.
Title Random Matrix Day
Speaker, Affiliation Various Speakers
Date, Time * 29 May 2024, 10:15-17:00
Location HG E 41
Abstract A day of talks on Random Matrix Theory, in an informal environment encouraging discussion. The planned schedule is:
10:15-11:00 Afonso Bandeira
11:15-12:00 Antti Knowles
14:15-15:00 Ashkan Nikeghbali
15:15-16:00 Dominik Schröder
16:15-17:00 Benjamin Schlein
Title Bounding the covariance matrix of a spin system
Speaker, Affiliation Prof. Dr. Ahmed El Alaoui, Cornell, US
Date, Time 19 June 2024, 15:15-16:15
Location HG G 19.1
Abstract In this talk I will address the question of bounding the operator norm of the covariance matrix of a disordered spin system. A first motivation stems from random matrix theory, where one would like to develop techniques for understanding matrices with subtle dependencies between the entries. A second motivation is about establishing functional inequalities and rapid mixing of Markov chains for these systems, where it has recently become apparent that bounding the norm of certain covariance matrices is a crucial step. I will explain the above connections and present new results in the case of the Sherrington-Kirkpatrick model and the random field Ising model. I will then pose open problems and challenges arising when trying to extend these techniques to other settings of interest. Relevant readings:
https://arxiv.org/abs/2212.02445 (with Gaitonde)
https://arxiv.org/abs/2311.06171 (with Eldan, Gheissari and Piana)
https://arxiv.org/abs/2212.14476 (Brennecke, Schertzer, Xu, Yau)
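At toy scale, the object of the talk can be computed exactly. This sketch (my choices: N = 8 spins, inverse temperature 0.5) enumerates all 2^N configurations of a Sherrington-Kirkpatrick system, forms the Gibbs covariance matrix of the spins, and takes its operator norm via its largest eigenvalue; the talk concerns bounds on this norm that survive as N grows, where enumeration is hopeless.

```python
import numpy as np
from itertools import product

# Exact Gibbs covariance of a small Sherrington-Kirkpatrick system by
# brute-force enumeration of all 2^N spin configurations (toy scale only).
rng = np.random.default_rng(6)
N, beta = 8, 0.5
J = rng.standard_normal((N, N)) / np.sqrt(N)
J = (J + J.T) / 2                      # symmetric Gaussian couplings

configs = np.array(list(product([-1.0, 1.0], repeat=N)))      # (2^N, N)
energies = 0.5 * np.einsum('ci,ij,cj->c', configs, J, configs)
w = np.exp(beta * energies)
w /= w.sum()                           # Gibbs probabilities

mean = w @ configs                                 # <sigma_i>
second = (configs * w[:, None]).T @ configs        # <sigma_i sigma_j>
cov = second - np.outer(mean, mean)

# Covariance matrices are psd, so the operator norm is the top eigenvalue.
op_norm = float(np.linalg.eigvalsh(cov).max())
```

The subtlety the talk addresses is visible even here: the entries of cov are all functionals of the same random couplings J, so standard random-matrix bounds for independent entries do not apply.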

Note: events marked with an asterisk (*) indicate that the time and/or location differ from the usual time and/or location.
