
Research reports

Space-time deep neural network approximations for high-dimensional partial differential equations

by F. Hornung and A. Jentzen and D. Salimova

(Report number 2020-35)

Abstract
Approximately solving high-dimensional partial differential equations (PDEs) is one of the most challenging issues in applied mathematics: most numerical approximation methods for PDEs in the scientific literature suffer from the so-called curse of dimensionality, in the sense that the number of computational operations the approximation scheme needs to reach a precision \(\varepsilon > 0\) grows exponentially in the PDE dimension and/or the reciprocal of \(\varepsilon\). Recently, certain deep learning based approximation methods for PDEs have been proposed, and various numerical simulations for such methods suggest that deep neural network (DNN) approximations might indeed have the capacity to overcome the curse of dimensionality, in the sense that the number of real parameters used to describe the approximating DNNs grows at most polynomially in both the PDE dimension \(d \in \mathbb{N}\) and the reciprocal of the prescribed approximation accuracy \(\varepsilon > 0\). There are by now also a few rigorous mathematical results in the scientific literature which substantiate this conjecture by proving that DNNs overcome the curse of dimensionality in approximating solutions of PDEs. Each of these results establishes that DNNs overcome the curse of dimensionality in approximating suitable PDE solutions at a fixed time point \(T > 0\) and on a compact cube \([a, b]^d\) in space, but none of them answers the question of whether the entire PDE solution on \([0, T] \times [a, b]^d\) can be approximated by DNNs without the curse of dimensionality. Overcoming this issue is precisely the subject of this article. More specifically, the main result of this work proves, for every \(a \in \mathbb{R}\) and \(b \in (a, \infty)\), that solutions of certain Kolmogorov PDEs can be approximated by DNNs on the space-time region \([0, T] \times [a, b]^d\) without the curse of dimensionality.
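As the keywords below indicate, the analysis rests on the Feynman-Kac formula, which represents the solution of a Kolmogorov PDE as \(u(t, x) = \mathbb{E}[\varphi(X_t^x)]\), where \(X^x\) solves an associated stochastic differential equation started at \(x\). The following is a minimal sketch of the Monte Carlo Euler idea for constant-coefficient SDEs — an illustration of the probabilistic representation only, not the paper's DNN construction; the function name `monte_carlo_euler` and the constant-coefficient restriction are assumptions made for brevity.

```python
import numpy as np

def monte_carlo_euler(phi, mu, sigma, x0, T,
                      num_steps=50, num_samples=100_000, rng=None):
    """Estimate u(T, x0) = E[phi(X_T)] for the Kolmogorov PDE associated
    with dX_t = mu dt + sigma dW_t (constant drift vector mu, constant
    diffusion matrix sigma) via the Euler-Maruyama scheme.

    Illustrative sketch only; the paper's result is about DNN
    approximations, not this Monte Carlo estimator.
    """
    rng = np.random.default_rng(rng)
    d = len(x0)
    dt = T / num_steps
    # All sample paths start at x0: shape (num_samples, d).
    X = np.tile(np.asarray(x0, dtype=float), (num_samples, 1))
    for _ in range(num_steps):
        # Brownian increments with variance dt per coordinate.
        dW = rng.normal(0.0, np.sqrt(dt), size=(num_samples, d))
        X = X + mu * dt + dW @ sigma.T
    # Monte Carlo average of the terminal payoff.
    return phi(X).mean()

# Sanity check against a case with a known solution: the heat equation
# u_t = Laplacian(u) in d = 10 dimensions with phi(x) = |x|^2 has
# u(T, 0) = 2 d T, so u(1, 0) = 20.
estimate = monte_carlo_euler(
    phi=lambda X: (X ** 2).sum(axis=1),
    mu=np.zeros(10),
    sigma=np.sqrt(2.0) * np.eye(10),
    x0=np.zeros(10),
    T=1.0,
    rng=0,
)
```

Note that the cost of this estimator is independent of any spatial grid, which is exactly why Feynman-Kac-based arguments are a natural tool for dimension-robust approximation results; the article's contribution is to carry such a representation over to DNN approximations on the whole space-time region.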

Keywords: deep neural network, DNN, artificial neural network, ANN, curse of dimensionality, approximation, partial differential equation, PDE, stochastic differential equation, SDE, Monte Carlo Euler, Feynman-Kac formula

BibTeX
@Techreport{HJS20_908,
  author = {F. Hornung and A. Jentzen and D. Salimova},
  title = {Space-time deep neural network approximations for high-dimensional partial differential equations},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2020-35},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2020/2020-35.pdf},
  year = {2020}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Do reference the published version if possible (see SAM Publications).
