Research reports

Analysis of the generalization error: Empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations

by J. Berner and Ph. Grohs and A. Jentzen

(Report number 2018-31)

Abstract
The development of new classification and regression algorithms based on empirical risk minimization (ERM) over deep neural network hypothesis classes, commonly referred to as deep learning, has revolutionized the areas of artificial intelligence, machine learning, and data analysis. More recently, these methods have been applied with great success to the numerical solution of high-dimensional partial differential equations (PDEs). In particular, recent simulations indicate that deep-learning-based algorithms are capable of overcoming the curse of dimensionality in the numerical solution of linear Kolmogorov PDEs.
Kolmogorov PDEs are widely used in models from engineering, finance, and the natural sciences; in particular, they are heavily employed in models for the approximate pricing of financial derivatives. Nearly all approximation methods for Kolmogorov PDEs in the literature suffer from the curse of dimensionality. By contrast, recent work by some of the authors showed that deep ReLU neural networks are capable of approximating solutions of Kolmogorov PDEs without incurring the curse of dimensionality. The present paper considerably strengthens these results by providing an analysis of the generalization error. In particular, we show that for Kolmogorov PDEs with affine drift and diffusion coefficients and a given accuracy \(\varepsilon > 0\), ERM over deep neural network hypothesis classes whose size, together with the number of training samples, scales polynomially in the dimension \(d\) and in \(\varepsilon^{-1}\) approximates the solution of the Kolmogorov PDE to within accuracy \(\varepsilon\) with high probability. We conclude that ERM over deep neural network hypothesis classes breaks the curse of dimensionality for the numerical solution of linear Kolmogorov PDEs with affine drift and diffusion coefficients. To the best of our knowledge, this is the first rigorous mathematical result proving the efficiency of deep learning methods for high-dimensional problems.
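The ERM procedure analyzed in the report can be illustrated with a minimal sketch (not the authors' code): assuming the Feynman-Kac representation u(T, x) = E[phi(X_T^x)] of the Kolmogorov PDE with Black-Scholes (affine) coefficients, one draws i.i.d. samples of the initial value X and of the exactly simulated terminal value X_T^X, and minimizes the empirical squared loss over a deep ReLU network. All concrete choices below (dimension, payoff, network width, sample and step counts, and the use of Adam as a practical stand-in for an exact empirical risk minimizer) are illustrative assumptions, not values from the report.

import torch
import torch.nn as nn

d, T, r, sigma = 10, 1.0, 0.05, 0.2   # dimension, maturity, rate, volatility (illustrative)
K = 100.0                             # strike of a basket-call payoff
a, b = 90.0, 110.0                    # initial values sampled uniformly from [a, b]^d
n_train = 2**17                       # number of i.i.d. training samples

def payoff(x):
    # phi(x) = max(mean(x) - K, 0)
    return torch.clamp(x.mean(dim=1, keepdim=True) - K, min=0.0)

# Training data (X, Y): X uniform on [a, b]^d, Y = phi of the exactly simulated
# geometric Brownian motion at time T started at X (uncorrelated components).
X = a + (b - a) * torch.rand(n_train, d)
W = torch.randn(n_train, d) * T**0.5
X_T = X * torch.exp((r - 0.5 * sigma**2) * T + sigma * W)
Y = payoff(X_T)

# Deep ReLU hypothesis class; in the setting of the paper its size would be
# chosen to scale polynomially in d and 1/epsilon.
model = nn.Sequential(
    nn.Linear(d, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

# Empirical risk minimization with the squared loss.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for step in range(2000):
    idx = torch.randint(0, n_train, (1024,))
    loss = loss_fn(model(X[idx]), Y[idx])
    opt.zero_grad()
    loss.backward()
    opt.step()

# model(x) now approximates u(T, x) = E[phi(X_T^x)] on [a, b]^d.

In the setting of the report, both the size of the hypothesis class and the number of training samples scale polynomially in d and in 1/epsilon, which is the sense in which the curse of dimensionality is broken.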

BibTeX
@Techreport{BGJ18_785,
  author = {J. Berner and Ph. Grohs and A. Jentzen},
  title = {Analysis of the generalization error: Empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2018-31},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2018/2018-31.pdf},
  year = {2018}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please reference the published version if possible (see SAM Publications).
