Research reports

Exponential ReLU DNN expression of holomorphic maps in high dimension

by J. A. A. Opschoor, Ch. Schwab and J. Zech

(Report number 2019-35)

Abstract
For a parameter dimension \(d\in\mathbb{N}\), we consider the approximation of many-parametric maps \(u: [-1,1]^d\to \mathbb{R}\) by deep ReLU neural networks. The input dimension \(d\) may be large, and we assume quantitative control of the domain of holomorphy of \(u\): i.e., \(u\) admits a holomorphic extension to a Bernstein polyellipse \(\mathcal{E}_{\rho_1}\times \cdots \times \mathcal{E}_{\rho_d} \subset \mathbb{C}^d\) of semiaxis sums \(\rho_j>1\) containing \([-1,1]^{d}\). We establish the exponential expression rate \(O(\exp(-bN^{1/(d+1)}))\) in the \(W^{1,\infty}([-1,1]^d)\) norm, in terms of the total NN size \(N\) and of the input dimension \(d\) of the ReLU NN. The constant \(b>0\) depends on \((\rho_j)_{j=1}^d\), which characterize the coordinate-wise sizes of the Bernstein ellipses for \(u\). We also prove exponential convergence in stronger norms for the approximation by DNNs with more regular, so-called "rectified power unit" (RePU) activations. Finally, we extend the DNN expression rate bounds to two classes of non-holomorphic functions: \(d\)-variate, Gevrey-regular functions and, by composition, certain multivariate probability distribution functions with Lipschitz marginals.
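The following Python sketch is only an empirical illustration of the expression-rate statement above, not the paper's constructive proof: it fits ReLU networks of increasing size \(N\) to a univariate function that is holomorphic on a Bernstein ellipse containing \([-1,1]\) and records the sup-norm error on a fine grid. The target \(u(x)=1/(2-x)\), the widths, depths, optimizer and training budget are all illustrative assumptions, and the trained error only upper-bounds the best achievable (expressivity) error that the theorem controls.

# Illustrative sketch (assumes PyTorch): train ReLU MLPs of growing size
# on a function holomorphic near [-1,1] and record the sup-norm error.
# Architecture and training choices are arbitrary; this does not
# reproduce the explicit NN construction used in the paper.
import torch
import torch.nn as nn

def target(x):
    # u(x) = 1/(2 - x) extends holomorphically to a Bernstein ellipse around [-1, 1].
    return 1.0 / (2.0 - x)

def relu_mlp(width, depth):
    # Fully connected ReLU network with `depth` hidden layers of size `width`.
    layers = [nn.Linear(1, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)

def sup_error(model, n_grid=2001):
    # Approximate sup-norm error on a fine grid of [-1, 1].
    x = torch.linspace(-1.0, 1.0, n_grid).unsqueeze(1)
    with torch.no_grad():
        return (model(x) - target(x)).abs().max().item()

x_train = torch.linspace(-1.0, 1.0, 512).unsqueeze(1)
y_train = target(x_train)

for width, depth in [(4, 2), (8, 3), (16, 4), (32, 5)]:
    model = relu_mlp(width, depth)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(5000):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x_train), y_train)
        loss.backward()
        opt.step()
    n_params = sum(p.numel() for p in model.parameters())
    print(f"N = {n_params:6d}  sup-norm error = {sup_error(model):.3e}")

For sufficiently well-trained networks one would expect the logarithm of the recorded error to decay roughly linearly in \(N^{1/2}\) (here \(d=1\), so \(N^{1/(d+1)}=N^{1/2}\)), consistent with the \(O(\exp(-bN^{1/(d+1)}))\) bound; deviations reflect optimization error rather than expressive power.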

Keywords: Deep ReLU neural networks, approximation rates, exponential convergence

BibTeX
@Techreport{OSZ19_839,
  author = {J. A. A. Opschoor and Ch. Schwab and J. Zech},
  title = {Exponential ReLU DNN expression of holomorphic maps in high dimension},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2019-35},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2019/2019-35.pdf},
  year = {2019}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please reference the published version if possible (see SAM Publications).
