Research reports – Seminar for Applied Mathematics | ETH Zurich

Research reports

Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations

by L. Gonon and Ch. Schwab

(Report number 2021-08)

Abstract
Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferential equations (PIDEs) on state spaces of possibly high dimension \(d\). Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion, advection, and for pure jump Lévy processes. We prove for such PIDEs, arising from a class of jump-diffusions on \({\mathbb R}^d\), that for any compact \(K\subset {\mathbb R}^d\) there exist constants \(C,{\mathfrak{p}},{\mathfrak{q}}>0\) such that for every \(\varepsilon \in (0,1]\) and every \(d\in {\mathbb N}\) the normalized (over \(K\)) DNN \(L^2\)-expression error of viscosity solutions of the PIDE is of size \(\varepsilon\) with DNN size bounded by \(Cd^{\mathfrak{p}}\varepsilon^{-\mathfrak{q}}\). In particular, the constant \(C>0\) is independent of \(d\in {\mathbb N}\) and of \(\varepsilon \in (0,1]\), and depends only on the coefficients in the PIDE and on the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. As a consequence of the employed techniques, we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.

Keywords:

BibTeX
@Techreport{GS21_950,
  author = {L. Gonon and Ch. Schwab},
  title = {Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2021-08},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2021/2021-08.pdf},
  year = {2021}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from published journal version. Do reference the published version if possible (see SAM Publications).