Optimization Seminar

Autumn Semester 2016

26 September 2016, 16:00-17:00
Speaker: Dr. David Adjiashvili, Institute for Operations Research of ETH Zurich
Title: Improved Approximation for Weighted Tree Augmentation with Bounded Costs (discussion seminar)
Location: HG G 19.1

17 October 2016, 16:00-17:00
Speaker: Dr. Stefan Weltge, Institute for Operations Research of ETH Zurich
Title: Extended formulations: Constructions (discussion seminar)
Location: HG G 19.1

24 October 2016, 16:00-17:00
Speaker: Silvia Di Gregorio, Università degli Studi di Padova, Padova, Italia
Title: Structure of extreme functions with discrete domain
Location: HG G 19.1

31 October 2016, 16:00-17:00
Speaker: Dr. Stephen Chestnut, Institute for Operations Research of ETH Zurich
Title: Concentration (discussion seminar)
Location: HG F 26.3
Abstract: In probability theory, a concentration inequality bounds how a random variable X deviates from its mean. These inequalities are extremely important for the study of randomized algorithms. When X is a sum of independent random variables, we can apply our old favorites like Chebyshev's and Chernoff's inequalities to bound the deviation of the sum from its mean. But what if X is more complicated than a simple sum? Can we still prove that it concentrates? Very often, the answer is yes. We'll start the talk with some variants of the famous Johnson-Lindenstrauss Lemma and I'll show you how one concentration inequality plays a role in a fast randomized algorithm for linear least squares. Next I'll introduce some more general methods for proving concentration inequalities, with a focus on the Efron-Stein Inequality and the "Entropy Method". We'll see how these work by applying them to prove concentration for a monotone submodular function evaluated on a random set.
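For readers unfamiliar with the least-squares example mentioned in the abstract, the following is a minimal, illustrative Python sketch and not the speaker's construction: it compresses a tall least-squares problem with a Gaussian, Johnson-Lindenstrauss-style random projection and compares the sketched solution to the exact one. The dimensions and sketch size are arbitrary assumptions chosen only for demonstration.

```python
# Illustrative sketch-and-solve demo (assumed setup, not from the talk):
# a Johnson-Lindenstrauss-style Gaussian projection compresses a tall
# least-squares problem; concentration is what makes the compressed
# residuals track the original ones with high probability.
import numpy as np

rng = np.random.default_rng(0)

n, d = 5000, 20                      # tall problem: many rows, few unknowns
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

# Exact least-squares solution on the full problem.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Gaussian sketch with m << n rows (m = 400 is a heuristic choice here):
# if ||S(Ax - b)|| stays close to ||Ax - b|| for all x, the sketched
# solution is nearly as good as the exact one.
m = 400
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

print("exact residual   :", np.linalg.norm(A @ x_exact - b))
print("sketched residual:", np.linalg.norm(A @ x_sketch - b))
```

Under this assumed setup the sketched residual typically comes out close to the exact one, which is the kind of concentration phenomenon the abstract alludes to.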

Note: if you wish, you can subscribe to the iCal/ics calendar.
