DACO seminar

Please subscribe here if you would like to be notified about these events by e-mail. You can also subscribe to the iCal/ics calendar.

Autumn Semester 2020

DACO Seminar

Title: Optimal bounds for approximate counting
Speaker, Affiliation: Prof. Dr. Jelani Nelson, University of California at Berkeley, USA
Date, Time: 7 October 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: Counting up to N deterministically of course takes Theta(log N) bits. In the first-ever streaming algorithm, Morris in 1978 gave a randomized algorithm that improves upon this bound exponentially. The best known analysis of his algorithm shows that it gives a (1+eps)-approximation to N with probability at least 1-delta using O(log log N + log(1/eps) + log(1/delta)) bits with high probability (the space usage is itself a random variable). We show that a very slight (but necessary) tweak of his algorithm actually achieves the better bound O(log log N + log(1/eps) + log log(1/delta)) bits, and we also show a new matching lower bound, establishing optimality. Joint work with Huacheng Yu.
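
For context, here is a minimal Python sketch of the classical Morris counter (not the tweaked variant analyzed in the talk); the accuracy parameter a and the test sizes are illustrative assumptions.

```python
import random

def morris_counter(n_events, a=16.0):
    """Classical Morris counter: approximate counting in O(log log N) bits.

    Only the small register x is stored; the count is estimated as
    a * ((1 + 1/a)**x - 1). Larger `a` trades space for accuracy.
    """
    x = 0
    for _ in range(n_events):
        # Increment the register with probability (1 + 1/a)^(-x).
        if random.random() < (1.0 + 1.0 / a) ** (-x):
            x += 1
    return a * ((1.0 + 1.0 / a) ** x - 1.0)

# Averaging independent counters reduces the variance.
estimates = [morris_counter(100_000) for _ in range(20)]
print(sum(estimates) / len(estimates))  # should land near 100000
```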

DACO Seminar

Title: On estimating the mean of a random vector
Speaker, Affiliation: Prof. Dr. Gabor Lugosi, Pompeu Fabra University, Barcelona, Spain
Date, Time: 14 October 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: One of the most basic problems in statistics is the estimation of the mean of a random vector, based on independent observations. This problem has received renewed attention in the last few years, both from statistical and computational points of view. In this talk we review some recent results on the statistical performance of mean estimators that allow heavy tails and adversarial contamination in the data. The basic punchline is that one can construct estimators that, under minimal conditions, achieve the same kind of performance as if the data were Gaussian. The material of this talk is based on a series of joint papers with Shahar Mendelson.
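
As a taste of the robust estimators discussed here, the following is a sketch of the coordinatewise median-of-means estimator. The block count k, the data model, and the sizes are illustrative assumptions; the estimators in the talk (e.g. median-of-means tournaments) are more refined.

```python
import numpy as np

def median_of_means(X, k, rng):
    """Coordinatewise median-of-means estimate of the mean of X.

    Splits the n rows of X into k random blocks, averages within each
    block, and takes the coordinatewise median of the block means.
    """
    blocks = np.array_split(rng.permutation(X.shape[0]), k)
    block_means = np.array([X[b].mean(axis=0) for b in blocks])
    return np.median(block_means, axis=0)

rng = np.random.default_rng(0)
# Heavy-tailed sample: Student-t with 2.5 degrees of freedom, true mean 0.
X = rng.standard_t(df=2.5, size=(10_000, 5))
print(median_of_means(X, k=50, rng=rng))  # entries should be near 0
```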

DACO Seminar

Title: Hierarchical community detection by recursive partitioning
Speaker, Affiliation: Prof. Dr. Elizaveta (Liza) Levina, University of Michigan, Ann Arbor, USA
Date, Time: 21 October 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: Community detection in networks has been extensively studied in the form of finding a single partition into a “correct” number of communities. In large networks, however, a multi-scale hierarchy of communities is much more realistic. We show that a hierarchical tree of communities, obviously more interpretable, is also potentially more accurate and more computationally efficient. We construct this tree with a simple top-down recursive algorithm, at each step splitting the nodes into two communities with a non-iterative spectral algorithm, until a stopping rule suggests there are no more communities. The algorithm is model-free, extremely fast, and requires no tuning other than selecting a stopping rule. We propose a natural model for this setting, a binary tree stochastic block model, and prove that the algorithm correctly recovers the entire community tree under relatively mild assumptions. As a by-product, we obtain explicit and intuitive results for fitting the stochastic block model under model misspecification. We illustrate the algorithm on a dataset of statistics papers, constructing a highly interpretable tree of statistics research communities, and on a network based on gene co-occurrence in research papers on anemia. Joint work with Tianxi Li, Lihua Lei, Sharmodeep Bhattacharyya, Koen Van den Berge, Purnamrita Sarkar, and Peter Bickel.
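
The top-down recursion is easy to sketch. Below is an illustrative Python version that splits by the sign of the Fiedler vector of the subgraph Laplacian; the crude minimum-size rule is a placeholder assumption standing in for the model-based stopping rules developed in the work.

```python
import numpy as np

def recursive_bipartition(A, nodes=None, min_size=70):
    """Top-down hierarchical community detection (illustrative sketch).

    Recursively splits the node set by the sign of the second eigenvector
    (Fiedler vector) of the subgraph Laplacian until groups fall below
    min_size -- a placeholder for a principled stopping rule.
    """
    if nodes is None:
        nodes = np.arange(A.shape[0])
    if len(nodes) < min_size:
        return [nodes]
    sub = A[np.ix_(nodes, nodes)]
    L = np.diag(sub.sum(axis=1)) - sub        # unnormalized Laplacian
    _, vecs = np.linalg.eigh(L)
    v = vecs[:, 1]                            # Fiedler vector
    left, right = nodes[v < 0], nodes[v >= 0]
    if len(left) == 0 or len(right) == 0:
        return [nodes]
    return (recursive_bipartition(A, left, min_size)
            + recursive_bipartition(A, right, min_size))

# Two planted blocks of 60 nodes; the first split should recover them,
# and the size rule then stops the recursion.
rng = np.random.default_rng(0)
P = np.full((120, 120), 0.05); P[:60, :60] = P[60:, 60:] = 0.3
A = (rng.random((120, 120)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                # symmetric, no self-loops
print([len(c) for c in recursive_bipartition(A)])
```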

DACO Seminar

Title: Backward Feature Correction: How can Deep Learning perform Deep Learning?
Speaker, Affiliation: Prof. Dr. Yuanzhi Li, Carnegie Mellon University, Pittsburgh, USA
Date, Time: 11 November 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: How does a 110-layer ResNet learn a high-complexity classifier using relatively few training examples and a short training time? We present a theory towards explaining this deep learning process in terms of hierarchical learning. By hierarchical learning we mean that the learner represents a complicated target function by decomposing it into a sequence of simpler functions, reducing sample and time complexity. This work formally analyzes how multi-layer neural networks can perform such hierarchical learning efficiently and automatically, simply by applying stochastic gradient descent (SGD) to the training objective, especially when other “shallow” models provably fail to learn the concept class efficiently due to the lack of hierarchy. In particular, we establish a new principle called “backward feature correction” to show how the features in the lower-level layers of the network can also be improved by training together with higher-level layers, which we believe is the key to understanding the deep learning process in multi-layer neural networks. We also present empirical evidence supporting our theorem; in particular, we show “how much, how deep” the “backwards” (i.e., the improvement of lower-level layers in a neural network due to the gradient from higher-level layers) in a multi-layer neural network needs to be, and which parts of the lower-level features are improved through the “backwards”.
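
To make the “lower layers improve via gradients from higher layers” point concrete, here is a toy PyTorch contrast between end-to-end training and training with the first layer frozen at its random initialization. The target function, widths, and hyperparameters are arbitrary assumptions for illustration, not the paper's construction.

```python
import torch
import torch.nn as nn

# Toy contrast: end-to-end SGD (lower layers receive gradients from the
# layers above them) versus freezing the first layer at initialization.
torch.manual_seed(0)
X = torch.randn(2048, 10)
y = (torch.sin(X[:, 0] * X[:, 1]) + X[:, 2] ** 2).unsqueeze(1)

def make_net():
    return nn.Sequential(nn.Linear(10, 64), nn.ReLU(),
                         nn.Linear(64, 64), nn.ReLU(),
                         nn.Linear(64, 1))

def train(net, freeze_first=False, steps=2000, lr=0.05):
    if freeze_first:
        for p in net[0].parameters():      # first Linear layer
            p.requires_grad_(False)
    params = [p for p in net.parameters() if p.requires_grad]
    opt = torch.optim.SGD(params, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(X), y)
        loss.backward()
        opt.step()
    return loss.item()

print("end-to-end loss:   ", train(make_net()))
print("frozen-lower loss: ", train(make_net(), freeze_first=True))
```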

DACO Seminar

Title: Equilibrium Computation and the Foundations of Deep Learning
Speaker, Affiliation: Prof. Dr. Constantinos Daskalakis, MIT, USA
Date, Time: 25 November 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: Deep Learning has recently yielded important advances in single-agent learning challenges, much of that progress being fueled by the empirical success of gradient descent and its variants in computing local optima of non-convex optimization problems. In multi-agent learning applications, the role of single-objective optimization is played by equilibrium computation, yet our understanding of its complexity in settings that are relevant for Deep Learning remains sparse. In this talk we focus on min-max optimization of nonconvex-nonconcave objectives, which has found applications in GANs and other adversarial learning problems. Here, not only are there no known gradient-descent-based methods converging to even local, approximate min-max equilibria, but the computational complexity of identifying them remains poorly understood. We show that finding approximate local min-max equilibria of Lipschitz and smooth objectives requires a number of queries to the function and its gradient that is exponential in the relevant parameters, in sharp contrast to the polynomial number of queries required to find approximate local minima of non-convex objectives. Our oracle lower bound is a byproduct of a complexity-theoretic result showing that finding approximate local min-max equilibria is computationally equivalent to finding Brouwer fixed points and Nash equilibria in non-zero-sum games, and thus PPAD-complete. Minimal knowledge of complexity theory will be assumed in the talk. Joint work with Stratis Skoulakis and Manolis Zampetakis.
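
A one-line illustration of why min-max optimization is delicate: even on the bilinear (convex-concave) toy objective f(x, y) = xy, simultaneous gradient descent-ascent spirals away from the equilibrium at the origin; the nonconvex-nonconcave setting of the talk is harder still. The step size and iteration count below are arbitrary choices.

```python
# Simultaneous gradient descent-ascent (GDA) on f(x, y) = x * y:
# descend in x, ascend in y. Each step multiplies the distance to the
# equilibrium (0, 0) by sqrt(1 + lr**2), so the iterates spiral outward.
def gda(lr=0.1, steps=100):
    x, y = 1.0, 1.0
    for _ in range(steps):
        gx, gy = y, x                        # df/dx = y, df/dy = x
        x, y = x - lr * gx, y + lr * gy
    return x, y

print(gda())  # far from (0, 0): plain GDA fails even in this easy case
```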

DACO Seminar

Title: Self-regularizing Property of the Nonparametric Maximum Likelihood Estimator in Mixture Models
Speaker, Affiliation: Prof. Dr. Yury Polyanskiy, MIT, USA
Date, Time: 2 December 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: Introduced by Kiefer and Wolfowitz in 1956, the nonparametric maximum likelihood estimator (NPMLE) is a widely used methodology for learning mixture models and empirical Bayes estimation. Sidestepping the non-convexity in mixture likelihood, the NPMLE estimates the mixing distribution by maximizing the total likelihood over the space of probability measures, which can be viewed as an extreme form of overparameterization. In this work we discover a surprising property of the NPMLE solution. Consider, for example, a Gaussian mixture model on the real line with a subgaussian mixing distribution. Leveraging complex-analytic techniques, we show that with high probability the NPMLE based on a sample of size n has O(log n) atoms (mass points), significantly improving the deterministic upper bound of n due to Lindsay (1983). Notably, any such Gaussian mixture is statistically indistinguishable from a finite one with O(log n) components (and this is tight for certain mixtures). Thus, absent any explicit form of model selection, the NPMLE automatically chooses the right model complexity, a property we term self-regularization. Extensions to other exponential families are given. As a statistical application, we show that this structural property can be harnessed to bootstrap the existing Hellinger risk bound of the (parametric) MLE for finite Gaussian mixtures to the NPMLE for general Gaussian mixtures, recovering a result of Zhang (2009). Time permitting, we will discuss connections to achieving the optimal regret in empirical Bayes. This is based on joint work with Yihong Wu (Yale).
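
A quick way to see self-regularization numerically: compute the NPMLE over a fixed grid of candidate atoms with the standard EM/fixed-point updates for the mixing weights, then count the atoms that retain non-negligible mass. The grid, data model, and thresholds below are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
centers = rng.choice([-2.0, 0.0, 2.0], size=n)      # true mixing distribution: 3 atoms
data = rng.normal(loc=centers, scale=1.0)

grid = np.linspace(-6, 6, 200)                      # candidate atom locations
lik = np.exp(-0.5 * (data[:, None] - grid[None, :]) ** 2)  # N(theta, 1) density, up to a constant
w = np.full(grid.size, 1.0 / grid.size)             # uniform initial weights

for _ in range(500):                                # EM / fixed-point updates for the weights
    post = lik * w
    post /= post.sum(axis=1, keepdims=True)
    w = post.mean(axis=0)

# The fitted weights concentrate on a few small clusters of grid points,
# far fewer than the 200 candidates.
print("atoms with weight > 1e-3:", np.count_nonzero(w > 1e-3))
```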

DACO Seminar

Title: Algebraic methods in statistics and data analysis
Speaker, Affiliation: Dr. Anna Seigal, University of Oxford, UK
Date, Time: 9 December 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: Algebraic structure is at the heart of many problems in statistics and data analysis. We aim to fit data to a model, or to approximate data by a point on some locus of interest. I will discuss how algebraic structure can be used to capture the existence and uniqueness of a solution to these problems, as well as to suggest suitable algorithms. I will first consider parameter estimation in statistical models via maximum likelihood estimation. We will see connections between maximum likelihood estimation and invariant theory. I will then discuss tensors, the higher-dimensional analogues of matrices. The loci of tensors that are of interest in applications often define semi-algebraic sets, given by polynomial equations and inequalities. One example is the signature: a sequence of tensors that encodes a path of time-series data. We will see the algebraic structure that relates a path to its signature.
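
Since the signature is the running example, here is a small Python sketch computing its first two levels for a piecewise-linear path via Chen's identity. The example path is an arbitrary choice, and libraries such as iisignature handle higher levels.

```python
import numpy as np

def signature_level_2(path):
    """Levels 1 and 2 of the signature of a piecewise-linear path.

    path : (m+1, d) array of points. Chen's identity: appending a linear
    segment with increment D updates S2 <- S2 + outer(S1, D) + outer(D, D)/2.
    """
    d = path.shape[1]
    S1, S2 = np.zeros(d), np.zeros((d, d))
    for D in np.diff(path, axis=0):
        S2 += np.outer(S1, D) + 0.5 * np.outer(D, D)
        S1 += D
    return S1, S2

t = np.linspace(0.0, 1.0, 100)
circle = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
S1, S2 = signature_level_2(circle)
print(S1)                   # ~0: the path returns to its starting point
print(S2[0, 1] - S2[1, 0])  # ~2*pi: twice the signed area enclosed by the path
```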

DACO Seminar

Title: Optimal group testing
Speaker, Affiliation: Prof. Dr. Amin Coja-Oghlan, Goethe Universität, Frankfurt
Date, Time: 16 December 2020, 17:15-18:15
Location: Zoom Meeting
Abstract: In group testing the aim is to identify a small set of infected individuals within a large population. At our disposal we have a test procedure that can be applied to a group of individuals, with the test returning a positive result if at least one individual in the group is infected. All tests are conducted in parallel. The aim is to devise a (possibly randomised) test design with as few tests as possible that identifies the infected individuals with high probability. We show that a sharp information-theoretic/algorithmic phase transition occurs as the number of tests passes an explicit threshold. The talk is based on joint work with Oliver Gebhard, Max Hahn-Klimroth and Philipp Loick.
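
To get a feel for the setup, here is a small simulation of non-adaptive group testing with a random Bernoulli pooling design and the simple COMP decoder (declare healthy anyone who appears in a negative pool). The population size, infection count, test budget, and pool density are illustrative assumptions; the talk's optimal design and decoder are more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, m = 1000, 10, 250              # population, number infected, number of tests
infected = np.zeros(n, dtype=bool)
infected[rng.choice(n, size=k, replace=False)] = True

# Random Bernoulli design: each individual joins each pool independently.
pools = rng.random((m, n)) < np.log(2) / k         # density ~ln(2)/k is a common choice
results = (pools & infected[None, :]).any(axis=1)  # pool positive iff it hits an infected person

# COMP decoder: anyone appearing in at least one negative pool is healthy.
in_negative_pool = pools[~results].any(axis=0)
declared = ~in_negative_pool

print("false negatives:", np.count_nonzero(infected & ~declared))  # COMP never misses
print("false positives:", np.count_nonzero(~infected & declared))
```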

Organizers: Afonso Bandeira, MaD group at NYU
