DACO Seminar

Please subscribe here if you would like to be notified about these events by e-mail. You can also subscribe to the iCal/ics calendar.

Autumn Semester 2023

Title: Statistical-Computational Tradeoffs in Mixed Sparse Linear Regression
Speaker, Affiliation: Gabriel Arpino, University of Cambridge
Date, Time: * 3 October 2023, 15:05-16:00
Location: HG G 19.2
Abstract: We consider the problem of mixed sparse linear regression with two components, where two sparse signals are observed through n unlabelled noisy linear measurements. Prior work has shown that the problem suffers from a significant statistical-to-computational gap, resembling other computationally challenging high-dimensional inference problems such as Sparse PCA and Robust Sparse Mean Estimation. We establish the existence of a more extensive computational barrier for this problem through the method of low-degree polynomials, but show that the problem is computationally hard only in a very narrow symmetric parameter regime. We identify smooth information-computation tradeoffs in this problem and prove that a simple linear-time algorithm succeeds outside of the narrow hard regime. To the best of our knowledge, this is the first thorough study of the interplay between mixture symmetry, signal sparsity, and their joint impact on the computational hardness of mixed sparse linear regression. This is joint work with Ramji Venkataramanan (https://proceedings.mlr.press/v195/arpino23a.html).
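
To make the observation model concrete, the following is a minimal Python sketch of the data-generating process described in the abstract; the dimensions, the +-1 signal values, and all variable names are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
n, d, k, sigma = 200, 500, 10, 0.1   # samples, dimension, sparsity, noise level

def sparse_signal(d, k, rng):
    """Return a d-dimensional vector with k nonzero +-1 entries."""
    beta = np.zeros(d)
    support = rng.choice(d, size=k, replace=False)
    beta[support] = rng.choice([-1.0, 1.0], size=k)
    return beta

beta_1, beta_2 = sparse_signal(d, k, rng), sparse_signal(d, k, rng)

X = rng.standard_normal((n, d))       # measurement vectors
labels = rng.integers(0, 2, size=n)   # hidden component label of each sample
B = np.where(labels[:, None] == 0, beta_1, beta_2)
y = np.einsum("ij,ij->i", X, B) + sigma * rng.standard_normal(n)
# Inference task: recover beta_1 and beta_2 from (X, y) alone,
# without ever observing `labels`.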

Title: Benign nonconvexity in overparametrized group synchronization
Speaker, Affiliation: Dr. Andrew McRae, EPFL
Date, Time: 17 October 2023, 14:15-15:15
Location: HG G 19.1
Abstract: I consider an optimization problem arising in orthogonal group synchronization, in which we want to estimate orthogonal matrices from (potentially noisy) relative measurements. The naïve least-squares estimator over orthogonal matrices requires solving a nonconvex program that, in general, has many spurious local minima. We show that adding a small number of degrees of freedom (specifically, relaxing to optimization over slightly “wider” Stiefel manifold matrices) makes the nonconvexity benign in that every second-order critical point is a global minimum and, in fact, yields an optimal solution to the original unrelaxed problem. In the noiseless measurement case, our results are tight and resolve a previous conjecture on synchronization over Stiefel manifolds. The key proof innovation is a new randomized perturbation direction. Joint work with Nicolas Boumal; https://arxiv.org/abs/2307.02941.
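
The least-squares formulation and the widening relaxation can be written down directly. Below is a minimal sketch on an illustrative noiseless random instance: the relaxed variables are d x p blocks with orthonormal rows (Stiefel matrices, p > d), and embedding a planted orthogonal solution into the wider blocks stays feasible and attains zero cost.

import numpy as np

rng = np.random.default_rng(0)
m, d, p = 8, 3, 4  # number of group elements, group dimension, relaxed width

# Ground-truth orthogonal matrices and noiseless relative measurements
# R[i, j] = Q_i @ Q_j.T.
Q = [np.linalg.qr(rng.standard_normal((d, d)))[0] for _ in range(m)]
R = {(i, j): Q[i] @ Q[j].T for i in range(m) for j in range(m) if i < j}

def cost(Y):
    """Least-squares synchronization cost over d x p blocks Y_i
    with orthonormal rows (Y_i @ Y_i.T = I_d)."""
    return sum(np.linalg.norm(Rij - Y[i] @ Y[j].T, "fro") ** 2
               for (i, j), Rij in R.items())

# A feasible relaxed point: embed each Q_i into a d x p Stiefel matrix.
Y = [np.hstack([Q[i], np.zeros((d, p - d))]) for i in range(m)]
print(cost(Y))  # 0.0 at the planted solution, by construction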

Title: Fundamental limits of overparametrized shallow neural networks for supervised learning
Speaker, Affiliation: Dr. Daria Tieplova, ICTP, Trieste
Date, Time: 24 October 2023, 14:15-15:15
Location: HG G 19.1
Abstract: I will discuss joint work with Francesco Camilli and Jean Barbier concerning an information-theoretical analysis of a two-layer neural network trained from input-output pairs generated by a teacher network with matching architecture, in overparametrized regimes. Our results come in the form of bounds relating (i) the mutual information between training data and network weights, or (ii) the Bayes-optimal generalization error, to the same quantities for a simpler (generalized) linear model for which explicit expressions are rigorously known. Our bounds, which are expressed in terms of the number of training samples, the input dimension and the number of hidden units, thus yield fundamental performance limits for any neural network (and in fact any learning procedure) trained on limited data generated according to our two-layer teacher model. The proof relies on rigorous tools from spin glasses and is guided by "Gaussian equivalence principles" lying at the core of numerous recent analyses of neural networks. In contrast to the existing literature, which is either non-rigorous or restricted to learning the readout weights only, our results are information-theoretic (i.e., they are not specific to any learning algorithm) and, importantly, cover a setting where all the network parameters are trained.
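
As a concrete instance of the setup, here is a sketch of the teacher-student data-generating process from the abstract; the tanh activation, the scalings, and all dimensions are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)
n, d, k = 1000, 50, 200  # samples, input dimension, hidden units (k >> d: overparametrized)

W_teacher = rng.standard_normal((k, d)) / np.sqrt(d)  # first-layer weights
a_teacher = rng.standard_normal(k) / np.sqrt(k)       # readout weights

def teacher(X):
    """Two-layer teacher network."""
    return np.tanh(X @ W_teacher.T) @ a_teacher

X = rng.standard_normal((n, d))
y = teacher(X)  # add output noise here for a noisy channel
# The information-theoretic question: how well can *any* learning
# procedure recover the teacher, or generalize, from the n pairs (X, y)?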

Title: Statistical Mechanics of Graph Convolution Networks
Speaker, Affiliation: Prof. Dr. Ivan Dokmanić, University of Basel
Date, Time: 31 October 2023, 14:15-15:15
Location: HG G 19.1
Abstract: Graph neural networks (GNNs) excel at modeling relational data such as biological, social, and transportation networks, but the underpinnings of their success are elusive. Traditional complexity measures from statistical learning theory fail to account for observed phenomena such as double descent or the impact of relational semantics on generalization error. Motivated by experimental observations of "transductive" double descent in key networks and datasets, we use analytical tools from statistical physics and random matrix theory to precisely characterize generalization in simple graph convolution networks on the contextual stochastic block model. Our results illuminate the nuances of learning on homophilic versus heterophilic data and predict double descent, whose existence in GNNs has been questioned by recent work. We show how risk is shaped by the interplay between graph noise, feature noise, and the number of training labels. Our findings apply beyond stylized models, capturing qualitative trends in real-world GNNs and datasets. As a case in point, we use our analytic insights to improve the performance of state-of-the-art graph convolution networks on heterophilic datasets.
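
A minimal generator for the contextual stochastic block model mentioned in the abstract is sketched below; the parameter names and values are illustrative.

import numpy as np

rng = np.random.default_rng(0)
n, f = 200, 20            # nodes, feature dimension
p_in, p_out = 0.05, 0.01  # edge probabilities; p_in > p_out is homophilic,
                          # p_in < p_out heterophilic
feature_snr = 1.0

labels = rng.integers(0, 2, size=n) * 2 - 1   # two communities, +-1
same = labels[:, None] == labels[None, :]
P = np.where(same, p_in, p_out)
A = np.triu(rng.random((n, n)) < P, k=1)      # independent edge coin flips
A = (A | A.T).astype(float)                   # symmetric adjacency, no self-loops

u = rng.standard_normal(f) / np.sqrt(f)       # hidden feature direction
X = feature_snr * labels[:, None] * u[None, :] + rng.standard_normal((n, f))
# A graph convolution mixes X along A; generalization can then be studied
# as a function of graph noise (p_in vs p_out), feature noise, and the
# number of revealed training labels.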

Title: DACO-FDS: Perturbed Optimizers for Machine Learning
Speaker, Affiliation: Dr. Quentin Berthet, Google DeepMind
Date, Time: 2 November 2023, 16:15-17:15
Location: HG G 19.1
Abstract: Machine learning pipelines often rely on optimization procedures to make discrete decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete decisions are easily computed in a forward manner, they break the back-propagation of computational graphs. In order to expand the scope of learning problems that can be solved in an end-to-end fashion, we propose a systematic method to transform optimizers into operations that are differentiable and never locally constant. Our approach relies on stochastically perturbed optimizers and can be used readily within existing solvers. Their derivatives can be evaluated efficiently, and their smoothness tuned via the chosen noise amplitude. We also show how this framework can be connected to a family of losses developed in structured prediction, and give theoretical guarantees for their use in learning tasks. We demonstrate experimentally the performance of our approach on various tasks, including recent applications on protein sequences.
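
The core idea is to replace a discrete argmax by its expectation under random perturbations of the input, which is smooth and differentiable. Here is a minimal sketch for an argmax over a finite choice set with Gumbel noise, in which case the perturbed expectation has a closed form (a softmax); this illustrates the principle and is not the talk's implementation.

import numpy as np

rng = np.random.default_rng(0)

def argmax_one_hot(theta):
    """The discrete optimizer: piecewise constant, zero gradient a.e."""
    e = np.zeros_like(theta)
    e[np.argmax(theta)] = 1.0
    return e

def perturbed_argmax(theta, eps=0.5, n_samples=10_000):
    """Monte Carlo estimate of E[argmax(theta + eps * Z)], Z ~ Gumbel:
    a smooth surrogate whose smoothness is tuned by eps."""
    Z = rng.gumbel(size=(n_samples, theta.size))
    return np.mean([argmax_one_hot(theta + eps * z) for z in Z], axis=0)

theta = np.array([1.0, 2.0, 0.5])
print(perturbed_argmax(theta))                          # Monte Carlo estimate
print(np.exp(theta / 0.5) / np.exp(theta / 0.5).sum())  # its softmax limit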

Title: The statistical mechanics of synchronous local processes on graphs
Speaker, Affiliation: Freya Behrens, EPFL
Date, Time: 14 November 2023, 14:15-15:15
Location: HG G 19.1
Abstract: The friendly partition problem involves determining whether a given graph allows a partition of its nodes into two nonempty sets, where each node has at least as many neighbors in its own set as in the other. Notably, not all graphs permit such a friendly partition, and even fewer accommodate a partition in which every node requires an additional margin of neighbors in its own set compared to the other. We investigate the existence of such partitions and the algorithmic feasibility of finding them. A natural question is: how does a graph evolve when nodes directly adapt their states to meet these local constraints? When this adaptation occurs synchronously, it models scenarios like majority voting or cellular automata. However, it is not a given that a graph in which each node greedily and iteratively applies the local rule converges to a global solution. Our analysis examines the different types of attractors that emerge in locally constrained problems and the role of initialisation in shaping the outcome. Our tools for answering these questions are the cavity method and the backtracking dynamical cavity method from statistical physics, applied to synchronous update processes on regular graphs. They yield the sharp transitions for the existence of solutions, as well as the dynamical phase transitions of local processes in the large-system limit.
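
The friendliness condition and the synchronous local dynamics can be stated in a few lines of code. Below is a minimal sketch on an illustrative random graph; note that the formal definition also requires both sets to be nonempty, which this check does not enforce.

import numpy as np

def is_friendly(A, s, margin=0):
    """A: adjacency matrix; s: +-1 side of each node. Friendly iff every
    node has at least `margin` more neighbors on its own side than on
    the other; (A @ s) * s = own-side minus other-side neighbor counts."""
    return np.all((A @ s) * s >= margin)

def synchronous_step(A, s):
    """All nodes update at once to the majority side among their
    neighbors, keeping their current side on ties."""
    field = A @ s
    return np.where(field != 0, np.sign(field), s).astype(int)

rng = np.random.default_rng(0)
n = 20
A = np.triu((rng.random((n, n)) < 0.3).astype(int), 1)
A = A + A.T                       # simple undirected graph

s = rng.choice([-1, 1], size=n)
for _ in range(50):               # iterate the greedy local rule
    s_next = synchronous_step(A, s)
    if np.array_equal(s_next, s): # fixed point reached
        break
    s = s_next                    # may instead land on a cycle (an attractor)
print(is_friendly(A, s))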

Title: Operator Norm Bounds on the Correlation Matrix of the SK Model
Speaker, Affiliation: Prof. Dr. Christian Brennecke, University of Bonn
Date, Time: 28 November 2023, 14:15-15:15
Location: HG G 19.1
Abstract: In this talk I will review basic predictions for the high-temperature regime, the so-called replica symmetric regime, of the Sherrington-Kirkpatrick mean-field spin glass. I will recall the TAP equations and their derivation in connection with the decay of the two-point correlation functions. For the simplified case of vanishing external field, I will present some details of recent results that characterize the susceptibility of the model as a resolvent of the interaction matrix, which predicts in a simple way the (well-known) RS-RSB transition temperature. The talk is based on joint work with Adrien Schertzer, Changji Xu and Horng-Tzer Yau.
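
To illustrate the resolvent picture numerically: the sketch below uses the standard high-temperature heuristic in which the susceptibility is governed by the resolvent ((1 + beta^2) I - beta J)^(-1) of the interaction matrix J. Treat that formula as an assumption of this sketch; the point is that the spectral gap closes at beta = 1, reproducing the well-known transition temperature.

import numpy as np

rng = np.random.default_rng(0)
N = 2000

# GOE-like interaction matrix with entries of variance 1/N.
G = rng.standard_normal((N, N))
J = (G + G.T) / np.sqrt(2 * N)

lam_max = np.linalg.eigvalsh(J)[-1]  # -> 2 as N grows (semicircle edge)

# The resolvent norm is controlled by the gap (1 + b^2) - b * lam_max,
# which closes when b * lam_max = 1 + b^2, i.e. at b = 1 (T_c = 1).
for beta in (0.5, 0.8, 0.95):
    print(beta, (1 + beta**2) - beta * lam_max)  # gap -> 0 as beta -> 1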

Note: events marked with an asterisk (*) indicate that the time and/or location deviate from the usual time and/or location.
