DACO Seminar

Please subscribe here if you would like to be notified about these presentations via e-mail. You can also subscribe to the iCal/ics calendar.

Autumn Semester 2023

Title: Statistical-Computational Tradeoffs in Mixed Sparse Linear Regression
Speaker, Affiliation: Gabriel Arpino, University of Cambridge
Date, Time: 3 October 2023, 15:05-16:00 *
Location: HG G 19.2
Abstract: We consider the problem of mixed sparse linear regression with two components, where two sparse signals are observed through n unlabelled noisy linear measurements. Prior work has shown that the problem suffers from a significant statistical-to-computational gap, resembling other computationally challenging high-dimensional inference problems such as Sparse PCA and Robust Sparse Mean Estimation. We establish the existence of a more extensive computational barrier for this problem through the method of low-degree polynomials, but show that the problem is computationally hard only in a very narrow symmetric parameter regime. We identify smooth information-computation tradeoffs in this problem and prove that a simple linear-time algorithm succeeds outside of the narrow hard regime. To the best of our knowledge, this is the first thorough study of the interplay between mixture symmetry, signal sparsity, and their joint impact on the computational hardness of mixed sparse linear regression. This is joint work with Ramji Venkataramanan. https://proceedings.mlr.press/v195/arpino23a.html.
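The observation model described in the abstract can be sketched in a few lines of numpy. This is a minimal illustration only; the sample sizes, noise level, and balanced 1/2 mixing are my choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 500, 10  # samples, ambient dimension, sparsity (illustrative values)

def sparse_signal(rng, d, k):
    """k-sparse vector with Gaussian entries on a uniformly random support."""
    beta = np.zeros(d)
    support = rng.choice(d, size=k, replace=False)
    beta[support] = rng.standard_normal(k)
    return beta

beta1 = sparse_signal(rng, d, k)
beta2 = sparse_signal(rng, d, k)

X = rng.standard_normal((n, d))   # i.i.d. Gaussian design
z = rng.random(n) < 0.5           # latent component labels, never observed
noise = 0.1 * rng.standard_normal(n)
y = np.where(z, X @ beta1, X @ beta2) + noise   # unlabelled mixed measurements
```

The statistical task is to recover beta1 and beta2 from (X, y) alone, without the labels z.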

Title: Benign nonconvexity in overparametrized group synchronization
Speaker, Affiliation: Dr. Andrew McRae, EPFL
Date, Time: 17 October 2023, 14:15-15:15
Location: HG G 19.1
Abstract: I consider an optimization problem arising in orthogonal group synchronization, in which we want to estimate orthogonal matrices from (potentially noisy) relative measurements. The naïve least-squares estimator over orthogonal matrices requires solving a nonconvex program that, in general, has many spurious local minima. We show that adding a small number of degrees of freedom (specifically, relaxing to optimization over slightly “wider” Stiefel manifold matrices) makes the nonconvexity benign in that every second-order critical point is a global minimum and, in fact, yields an optimal solution to the original unrelaxed problem. In the noiseless measurement case, our results are tight and settle a previous conjecture in synchronization over Stiefel manifolds. The key proof innovation is a new randomized perturbation direction. Joint work with Nicolas Boumal; https://arxiv.org/abs/2307.02941.
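As a rough illustration of the noiseless setup, the following numpy sketch builds relative measurements and the least-squares synchronization cost. All names and sizes here are mine, and this shows the unrelaxed orthogonal problem rather than the Stiefel relaxation discussed in the talk:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 3  # number of group elements, matrix size (illustrative)

def random_orthogonal(rng, d):
    """Orthogonal matrix via QR of a Gaussian matrix, with sign-fixed columns."""
    q, r = np.linalg.qr(rng.standard_normal((d, d)))
    return q * np.sign(np.diag(r))

O = [random_orthogonal(rng, d) for _ in range(n)]
# Noiseless relative measurements R_ij = O_i O_j^T on all pairs i < j
R = {(i, j): O[i] @ O[j].T for i in range(n) for j in range(n) if i < j}

def cost(Y):
    """Least-squares synchronization cost over the given measurements."""
    return sum(np.linalg.norm(Y[i] - R[i, j] @ Y[j], "fro") ** 2 for (i, j) in R)

c = cost(O)  # essentially zero at the ground truth in the noiseless case
```

The relaxation in the talk replaces the d-by-d orthogonal variables with slightly wider Stiefel-manifold matrices (orthonormal columns), which is what makes the landscape benign.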

Title: Fundamental limits of overparametrized shallow neural networks for supervised learning
Speaker, Affiliation: Dr. Daria Tieplova, ICTP, Trieste
Date, Time: 24 October 2023, 14:15-15:15
Location: HG G 19.1
Abstract: I will discuss the joint work done with Francesco Camilli and Jean Barbier concerning an information-theoretical analysis of a two-layer neural network trained from input-output pairs generated by a teacher network with matching architecture, in overparametrized regimes. Our results come in the form of bounds relating i) the mutual information between training data and network weights, or ii) the Bayes-optimal generalization error, to the same quantities but for a simpler (generalized) linear model for which explicit expressions are rigorously known. Our bounds, which are expressed in terms of the number of training samples, input dimension and number of hidden units, thus yield fundamental performance limits for any neural network (and actually any learning procedure) trained from limited data generated according to our two-layer teacher neural network model. The proof relies on rigorous tools from spin glasses and is guided by “Gaussian equivalence principles” lying at the core of numerous recent analyses of neural networks. With respect to the existing literature, which is either non-rigorous or restricted to the case of the learning of the readout weights only, our results are information-theoretic (i.e., they are not specific to any learning algorithm) and, importantly, cover a setting where all the network parameters are trained.
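A minimal sketch of the matching teacher-student data-generation setup described above. The tanh activation, Gaussian teacher weights, and noiseless outputs are my assumptions for illustration; the talk's exact model may differ:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k = 1000, 50, 200  # samples, input dim, hidden units (overparametrized: k > d)

# Teacher network with standard 1/sqrt scaling (assumed, not from the talk)
W = rng.standard_normal((k, d)) / np.sqrt(d)  # first-layer weights
a = rng.standard_normal(k) / np.sqrt(k)       # readout weights

X = rng.standard_normal((n, d))               # i.i.d. Gaussian inputs
y = np.tanh(X @ W.T) @ a                      # teacher outputs

# The student has the same architecture; the talk bounds the Bayes-optimal
# generalization error of learning (W, a) from the n pairs (X, y).
```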

Title: Statistical Mechanics of Graph Convolution Networks
Speaker, Affiliation: Prof. Dr. Ivan Dokmanić, University of Basel
Date, Time: 31 October 2023, 14:15-15:15
Location: HG G 19.1
Abstract: Graph neural networks (GNNs) excel in modeling relational data such as biological, social, and transportation networks, but the underpinnings of their success are elusive. Traditional complexity measures from statistical learning theory fail to account for observed phenomena like the double descent or the impact of relational semantics on generalization error. Motivated by experimental observations of “transductive” double descent in key networks and datasets, we use analytical tools from statistical physics and random matrix theory to precisely characterize generalization in simple graph convolution networks on the contextual stochastic block model. Our results illuminate the nuances of learning on homophilic versus heterophilic data and predict double descent, whose existence in GNNs has been questioned by recent work. We show how risk is shaped by the interplay between the graph noise, feature noise, and the number of training labels. Our findings apply beyond stylized models, capturing qualitative trends in real-world GNNs and datasets. As a case in point, we use our analytic insights to improve performance of state-of-the-art graph convolution networks on heterophilic datasets.
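A toy version of the contextual stochastic block model and a single graph-convolution step might look like this. The parameter values and the degree normalization are my illustrative choices, not the ones analyzed in the talk:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 200, 20
labels = np.repeat([1, -1], n // 2)  # two balanced communities

# Stochastic block model adjacency: homophilic when p_in > p_out
p_in, p_out = 0.10, 0.02
P = np.where(np.equal.outer(labels, labels), p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric, no self-loops

# Contextual features: community-dependent mean plus Gaussian noise
mu = rng.standard_normal(d) / np.sqrt(d)
X = np.outer(labels, mu) + 0.5 * rng.standard_normal((n, d))

# One graph-convolution layer: degree-normalized aggregation of neighbor features
deg = A.sum(axis=1, keepdims=True) + 1e-9  # guard against isolated nodes
H = (A / deg) @ X                          # smoothed node representations
```

A linear readout on H, fit on a subset of labeled nodes, gives the transductive setting in which the double descent is observed.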

Title: DACO-FDS: Perturbed Optimizers for Machine Learning
Speaker, Affiliation: Dr. Quentin Berthet, Google DeepMind
Date, Time: 2 November 2023, 16:15-17:15
Location: HG G 19.1
Abstract: Machine learning pipelines often rely on optimization procedures to make discrete decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete decisions are easily computed in a forward manner, they break the back-propagation of computational graphs. In order to expand the scope of learning problems that can be solved in an end-to-end fashion, we propose a systematic method to transform optimizers into operations that are differentiable and never locally constant. Our approach relies on stochastically perturbed optimizers, and can be used readily within existing solvers. Their derivatives can be evaluated efficiently, and smoothness tuned via the chosen noise amplitude. We also show how this framework can be connected to a family of losses developed in structured prediction, and give theoretical guarantees for their use in learning tasks. We demonstrate experimentally the performance of our approach on various tasks, including recent applications on protein sequences.
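The core idea, replacing a piecewise-constant discrete optimizer by the expectation of its perturbed version, can be sketched with a Monte Carlo estimate of a perturbed argmax. The Gaussian noise, its scale, and the sample count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def argmax_onehot(theta):
    """Discrete optimizer: one-hot indicator of the largest score."""
    y = np.zeros_like(theta)
    y[np.argmax(theta)] = 1.0
    return y

def perturbed_argmax(theta, eps=0.5, n_samples=2000):
    """Monte Carlo estimate of E[argmax(theta + eps * Z)] with Gaussian Z.
    Unlike argmax itself, this expectation is smooth and never locally
    constant in theta; eps tunes the amount of smoothing."""
    out = np.zeros_like(theta)
    for _ in range(n_samples):
        out += argmax_onehot(theta + eps * rng.standard_normal(theta.size))
    return out / n_samples

theta = np.array([1.0, 0.5, -0.2])
p = perturbed_argmax(theta)  # a probability vector concentrated on index 0
```

In practice the derivatives of such perturbed optimizers can also be estimated from the same samples, which is what makes them usable inside back-propagation.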

Title: The statistical mechanics of synchronous local processes on graphs
Speaker, Affiliation: Freya Behrens, EPFL
Date, Time: 14 November 2023, 14:15-15:15
Location: HG G 19.1
Abstract: The friendly partition problem involves determining whether a given graph allows for a partition of its nodes into two nonempty sets, where each node has at least as many neighbors in its own set as in the other. Notably, not all graphs permit such a friendly partition, and even fewer accommodate a partition where a node requires an additional margin of neighbors in its own set compared to the other. We investigate the existence of such partitions and the algorithmic feasibility of finding them. A natural question is: how does a graph evolve when nodes directly adapt their states to meet these local constraints? When this adaptation occurs synchronously, it models scenarios like majority voting or cellular automata. However, it is not a given that a graph, where each node repeatedly applies the local greedy rule, converges to a global solution. Our analysis examines the different types of attractors that emerge in locally constrained problems and the role of initialisation in shaping the outcome. Our tools to answer these questions are the cavity method and the backtracking dynamical cavity method from statistical physics for synchronous update processes on regular graphs. They provide sharp thresholds for the existence of solutions, as well as the dynamical phase transitions of local processes in the large system limit.
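A minimal sketch of synchronous majority dynamics and the friendly-partition condition. The talk analyzes regular graphs; the Erdős–Rényi graph and the ties-keep-state rule here are illustrative choices, and synchronous dynamics may end in a short cycle rather than a friendly partition:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
A = rng.random((n, n)) < 0.05        # Erdős–Rényi graph (illustrative)
A = np.triu(A, 1)
A = (A | A.T).astype(int)            # symmetric adjacency, no self-loops

s = rng.choice([-1, 1], size=n)      # initial spin/partition assignment

def synchronous_majority(s, A, steps=50):
    """All nodes simultaneously adopt the majority sign of their neighbors;
    a node with a tie (or no neighbors) keeps its current state."""
    for _ in range(steps):
        field = A @ s
        s = np.where(field != 0, np.sign(field), s)
    return s

def is_friendly(s, A):
    """True iff every node has at least as many same-side as other-side neighbors."""
    return bool(np.all(s * (A @ s) >= 0))

s_final = synchronous_majority(s, A)
friendly = is_friendly(s_final, A)   # not guaranteed: attractors can be 2-cycles
```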

Title: Operator Norm Bounds on the Correlation Matrix of the SK Model
Speaker, Affiliation: Prof. Dr. Christian Brennecke, University of Bonn
Date, Time: 28 November 2023, 14:15-15:15
Location: HG G 19.1
Abstract: In this talk I will review basic predictions for the high temperature regime, the so-called replica symmetric regime, of the Sherrington-Kirkpatrick mean field spin glass. I will recall the TAP equations and their derivation in connection with the decay of the two-point correlation functions. For the simplified case of vanishing external field, I will present some details on recent results that characterize the susceptibility of the model as a resolvent of the interaction matrix, which predicts in a simple way the (well-known) RS-RSB transition temperature. The talk is based on joint work with Adrien Schertzer, Changji Xu and Horng-Tzer Yau.
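For orientation, the SK interaction matrix at zero external field and a resolvent-type object of the kind mentioned above can be sketched as follows. The normalization and the specific resolvent expression are my guesses at standard conventions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(6)
N = 500
G = rng.standard_normal((N, N))
J = (G + G.T) / np.sqrt(2 * N)  # GOE-like couplings, spectrum roughly in [-2, 2]
np.fill_diagonal(J, 0)

def hamiltonian(sigma, J):
    """SK energy at zero field: H(sigma) = -1/2 sigma^T J sigma."""
    return -0.5 * sigma @ J @ sigma

sigma = rng.choice([-1.0, 1.0], size=N)
E = hamiltonian(sigma, J)

# Resolvent-type matrix (illustrative): well-defined at high temperature,
# i.e. while beta * ||J|| < 1, which fails as beta grows toward the transition.
beta = 0.3
chi = np.linalg.inv(np.eye(N) - beta * J)
```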

Note: events marked with an asterisk (*) indicate that the time and/or location are different from the usual time and/or location.
