
Research reports

Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in $L^2(\R^d,\gamma_d)$

by Ch. Schwab and J. Zech

(Report number 2021-40)

Abstract
For artificial deep neural networks, we prove expression rates for analytic functions \(f:{\mathbb R}^d\to {\mathbb R}\) in the norm of \(L^2({\mathbb R}^d,\gamma_d)\) where \(d\in {\mathbb N}\cup\{ \infty \}\). Here \(\gamma_d\) denotes the Gaussian product probability measure on \({\mathbb R}^d\). We consider in particular \({\mathrm{ReLU}}\) and \({\mathrm{ReLU}}^k\) activations for integer \(k\geq 2\). For \(d\in\mathbb{N}\), we show exponential convergence rates in \(L^2(\mathbb{R}^d,\gamma_d)\). In case \(d=\infty\), under suitable smoothness and sparsity assumptions on \(f:{\mathbb R}^{\mathbb N}\to {\mathbb R}\), with \(\gamma_\infty\) denoting an infinite (Gaussian) product measure on \(({\mathbb R}^{\mathbb N}, {\mathcal B}({\mathbb R}^{\mathbb N}))\), we prove dimension-independent expression rate bounds in the norm of \(L^2({\mathbb R}^{\mathbb N},\gamma_\infty)\). The rates only depend on quantified holomorphy of (an analytic continuation of) the map \(f\) to a product of strips in \({\mathbb C}^d\) (in \({\mathbb C}^{\mathbb N}\) for \(d=\infty\), respectively). As an application, we prove expression rate bounds of deep \({\mathrm{ReLU}}\)-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
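For illustration only (this is not the paper's constructive proof): the error norm used throughout, \(\|f - \tilde f\|_{L^2(\mathbb{R}^d,\gamma_d)}\), is the root-mean-square error under the standard Gaussian product measure and can be estimated by Monte Carlo with standard normal samples. The sketch below fits a one-hidden-layer ReLU surrogate to an analytic target by least squares on random ReLU features (all function names, widths, and sample sizes are illustrative assumptions) and reports the estimated \(L^2(\gamma_d)\) error.

# Illustrative sketch: estimate || f - NN ||_{L^2(R^d, gamma_d)} by Monte Carlo.
# The ReLU surrogate here is a random-feature least-squares fit, chosen only
# to keep the example short; it is not the network construction of the paper.
import numpy as np

rng = np.random.default_rng(0)

d = 2                       # input dimension
n_hidden = 200              # width of the hidden ReLU layer
n_train, n_test = 20_000, 100_000

def f(x):
    # analytic target f : R^d -> R, x has shape (n, d)
    return np.exp(x.sum(axis=1) / np.sqrt(d))

def relu(z):
    return np.maximum(z, 0.0)

# Frozen random hidden layer; only the linear output layer is fitted.
W = rng.standard_normal((d, n_hidden))
b = rng.standard_normal(n_hidden)

x_train = rng.standard_normal((n_train, d))          # samples from gamma_d
H_train = relu(x_train @ W + b)
coef, *_ = np.linalg.lstsq(H_train, f(x_train), rcond=None)

# Monte Carlo estimate of the L^2(R^d, gamma_d) error on fresh Gaussian samples.
x_test = rng.standard_normal((n_test, d))
err = f(x_test) - relu(x_test @ W + b) @ coef
print(f"estimated L^2(gamma_d) error: {np.sqrt(np.mean(err ** 2)):.3e}")

Increasing n_hidden (and n_train) should drive the estimated error down; the theorem quantifies how fast such an error can decay, in terms of the network size, for analytic targets with quantified holomorphy on a product of strips.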

Keywords:

BibTeX
@Techreport{SZ21_982,
  author = {Ch. Schwab and J. Zech},
  title = {Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in $L^2(\R^d,\gamma_d)$},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2021-40},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2021/2021-40.pdf},
  year = {2021}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please reference the published version if possible (see SAM Publications).
