ETH-FDS seminar series

More information about ETH Foundations of Data Science can be found here


Please subscribe here if you would like to be notified about these events via e-mail. You can also subscribe to the iCal/ics calendar.

Spring Semester 2024

Date / Time Speaker Title Location
21 March 2024
16:15-17:15
Bryon Aragam
The University of Chicago Booth School of Business
Details

ETH-FDS seminar

Title Research Seminar on Statistics - FDS Seminar joint talk: Statistical aspects of nonparametric latent variable models and causal representation learning
Speaker, Affiliation Bryon Aragam, The University of Chicago Booth School of Business
Date, Time 21 March 2024, 16:15-17:15
Location HG D 1.2
Abstract One of the key paradigm shifts in statistical machine learning over the past decade has been the transition from handcrafted features to automated, data-driven representation learning. A crucial step in this pipeline is to identify latent representations from observational data along with their causal structure. In many applications, the causal variables are not directly observed and must be learned from data, often using flexible, nonparametric models such as deep neural networks. These settings present new statistical and computational challenges that will be the focus of this talk. We will revisit the statistical foundations of nonparametric latent variable models as a lens into the problem of causal representation learning. We discuss our recent work on developing methods for identifying and learning causal representations from data with rigorous guarantees, and discuss how even basic statistical properties are surprisingly subtle. Along the way, we will explore the connections between causal graphical models, deep generative models, and nonparametric mixture models, and how these connections lead to a useful new theory for causal representation learning.
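As a toy illustration of the latent variable estimation problem the abstract describes (not the speaker's method), expectation-maximization can recover the hidden components of a simple two-component Gaussian mixture in one dimension. The function name and quantile-based initialization below are illustrative assumptions:

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """Toy EM for a 2-component 1-D Gaussian mixture (illustrative sketch)."""
    # Crude initialization from data quantiles (an assumption of this sketch)
    mu = np.quantile(x, [0.25, 0.75])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma
```

With well-separated components this recovers the latent means; the identifiability subtleties the talk addresses arise precisely when such parametric assumptions are relaxed.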
15 May 2024
11:30-12:30
Aryan Mokhtari
UT Austin
Details

ETH-FDS seminar

Title Online Learning Guided Quasi-Newton Methods: Improved Global Non-asymptotic Guarantees
Speaker, Affiliation Aryan Mokhtari, UT Austin
Date, Time 15 May 2024, 11:30-12:30
Location CAB H 53
Abstract Quasi-Newton (QN) methods are popular iterative algorithms known for their superior practical performance compared to Gradient Descent (GD)-type methods. However, the existing theoretical results for this class of algorithms do not sufficiently justify their advantage over GD-type methods. In this talk, we discuss our recent efforts to address this issue. Specifically, in the strongly convex setting, we propose the first “globally” convergent QN method that achieves an explicit “non-asymptotic superlinear” rate. We show that the rate presented for our method is provably faster than GD after at most $O(d)$ iterations, where $d$ is the problem dimension. Additionally, in the convex setting, we present an accelerated variant of our proposed method that provably outperforms the accelerated gradient method and converges at a rate of $O(\min\{1/k^2, \sqrt{d \log k}/ k^{2.5}\})$, where $k$ is the number of iterations. To attain these results, we diverge from conventional approaches and construct our QN methods based on the Hybrid Proximal Extragradient (HPE) framework and its accelerated variants. Furthermore, a pivotal algorithmic concept underpinning our methodologies is an online learning framework for updating the Hessian approximation matrices. Specifically, we relate our method's convergence rate to the regret of a specific online convex optimization problem in the matrix space and choose the sequence of Hessian approximation matrices to minimize its overall regret.

Bio: Aryan Mokhtari is an Assistant Professor in the Electrical and Computer Engineering (ECE) Department at the University of Texas at Austin (UT Austin), where he holds the title of Fellow of Texas Instruments/Kilby. Prior to joining UT Austin, he was a Postdoctoral Associate in the Laboratory for Information and Decision Systems (LIDS) at MIT. Before that, he worked as a Research Fellow at the Simons Institute for the program on “Bridging Continuous and Discrete Optimization.” He earned his Ph.D. in electrical and systems engineering from the University of Pennsylvania (Penn). Aryan has received the NSF CAREER Award, the Army Research Office (ARO) Early Career Program Award, the Google Research Scholar Award, UT Austin ECE Department’s Junior Faculty Excellence in Teaching Award, the Simons-Berkeley Research Fellowship, and Penn’s Joseph and Rosaline Wolf Award for Best Doctoral Dissertation.
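For context, the Hessian-approximation update that the talk's online learning framework refines can be contrasted with the classical BFGS recursion. The sketch below is a generic textbook BFGS loop, not the speaker's proposed method; the function name, fixed unit step, and curvature-condition threshold are illustrative assumptions:

```python
import numpy as np

def bfgs_minimize(grad, x0, n_iter=100):
    """Textbook BFGS sketch: maintain an inverse-Hessian approximation H."""
    d = x0.size
    H = np.eye(d)                    # inverse-Hessian approximation
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    for _ in range(n_iter):
        p = -H @ g                   # quasi-Newton search direction
        x_new = x + p                # unit step; real solvers use a line search
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:               # curvature condition guards the update
            rho = 1.0 / sy
            V = np.eye(d) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

Classical analyses show only asymptotic superlinear rates for such updates; the talk's contribution is to choose the approximation sequence via online regret minimization so that explicit non-asymptotic global guarantees hold.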