Deep learning in high dimension: ReLU neural network expression for Bayesian PDE inversion (extended version)
by J. A. A. Opschoor and Ch. Schwab and J. Zech
(Report number 2020-47)
Abstract
We establish dimension-independent expression rates by deep ReLU networks for certain countably-parametric maps, so-called \((\boldsymbol{b},\varepsilon,\mathcal{X})\)-holomorphic functions. These are mappings from \([-1,1]^{\mathbb{N}}\) to \(\mathcal{X}\), with \(\mathcal{X}\) a Banach space, that admit analytic extensions to certain polyellipses in each of the input variables. Parametric maps of this type occur in uncertainty quantification for partial differential equations with uncertain inputs from function spaces, upon the introduction of bases. For such maps, we prove (constructive) expression rate bounds by families of deep neural networks, based on multilevel polynomial chaos expansions. We show that \((\boldsymbol{b},\varepsilon,\mathcal{X})\)-holomorphy implies summability and sparsity of coefficients in generalized polynomial chaos (gpc) expansions. This, in turn, implies deep neural network expression rate bounds.
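For orientation, the notion of \((\boldsymbol{b},\varepsilon,\mathcal{X})\)-holomorphy can be sketched as follows; this is a paraphrase of the standard quantified-holomorphy framework, and the precise admissibility conditions and constants are those given in the report itself. Given \(\boldsymbol{b}=(b_j)_{j\ge 1}\in\ell^1(\mathbb{N})\) with \(b_j\ge 0\) and \(\varepsilon>0\), a map \(u:[-1,1]^{\mathbb{N}}\to\mathcal{X}\) is \((\boldsymbol{b},\varepsilon,\mathcal{X})\)-holomorphic if there is \(C>0\) such that, for every sequence \((\rho_j)_{j\ge 1}\) with \(\rho_j>1\),

```latex
\[
  \sum_{j\ge 1} b_j(\rho_j - 1) \le \varepsilon
  \quad\Longrightarrow\quad
  u \text{ extends holomorphically, with norm bounded by } C,
  \text{ to } \prod_{j\ge 1} \mathcal{E}_{\rho_j},
\]
where \(\mathcal{E}_{\rho} := \{ (z + z^{-1})/2 : 1 \le |z| \le \rho \} \subset \mathbb{C}\)
is the closed Bernstein ellipse with foci \(\pm 1\).
\]
```

The weights \(b_j\) quantify the anisotropy of the parameter dependence: the faster \(b_j\) decays, the larger the admissible extension domains \(\mathcal{E}_{\rho_j}\) in the corresponding coordinates, which is the mechanism behind the dimension-independent rates.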
We apply the results to Bayesian inverse problems for partial differential equations with distributed, uncertain inputs from Banach spaces. Our results imply the existence of "neural Bayesian posteriors" emulating the posterior densities, with expression rate bounds that are free from the curse of dimensionality and limited only by the sparsity of certain gpc expansions. We prove that the neural Bayesian posteriors are robust in the large-data or small-noise asymptotics (e.g. [B. T. Knapik, A. W. van der Vaart and J. H. van Zanten, 2011]); that is, the posterior densities can be emulated in a noise-robust fashion.
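Concretely, in the standard additive-Gaussian-noise setting for Bayesian inversion (the notation below is illustrative and not taken from the report), with uncertain input \(u(y)\) parametrized by \(y\), forward-to-observation map \(\mathcal{G}\), observation data \(\delta\) and noise covariance \(\Gamma\), the density of the posterior with respect to the prior takes the form

```latex
\[
  \pi^{\delta}(y) \,\propto\,
  \exp\!\Big( -\tfrac{1}{2}
    \big\| \Gamma^{-1/2}\big( \delta - \mathcal{G}(u(y)) \big) \big\|^{2} \Big),
  \qquad y \in [-1,1]^{\mathbb{N}}.
\]
```

A "neural Bayesian posterior" in the above sense is then a deep ReLU network emulating the map \(y \mapsto \pi^{\delta}(y)\), with approximation error bounds that do not degrade as the number of active parameters grows.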
Keywords: Bayesian inverse problems, generalized polynomial chaos, deep neural networks, uncertainty quantification
BibTeX:
@Techreport{OSZ20_920,
  author      = {J. A. A. Opschoor and Ch. Schwab and J. Zech},
  title       = {Deep learning in high dimension: ReLU neural network expression for Bayesian PDE inversion (extended version)},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number      = {2020-47},
  address     = {Switzerland},
  url         = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2020/2020-47.pdf},
  year        = {2020}
}
Disclaimer
© Copyright for documents on this server remains with the authors.
Copies of these documents made by electronic or mechanical means, including
information storage and retrieval systems, may only be employed for
personal use. The administrators respectfully request that authors
inform them when any paper is published, to avoid copyright infringement.
Note that unauthorised copying of copyright material is illegal and may
lead to prosecution. Neither the administrators nor the Seminar for
Applied Mathematics (SAM) accept any liability in this respect.
The most recent version of a SAM report may differ in formatting and style
from the published journal version. Please reference the published version
where possible (see SAM Publications).