Research reports

A machine learning framework for data driven acceleration of computations of differential equations

by S. Mishra

(Report number 2018-28)

Abstract
We propose a machine learning framework to accelerate numerical computations of time-dependent ODEs and PDEs. Our method is based on recasting (generalizations of) existing numerical methods as artificial neural networks, with a set of trainable parameters. These parameters are determined in an offline training process by (approximately) minimizing suitable (possibly non-convex) loss functions by (stochastic) gradient descent methods. The proposed algorithm is designed to be always consistent with the underlying differential equation. Numerical experiments involving both linear and non-linear ODE and PDE model problems demonstrate a significant gain in computational efficiency over standard numerical methods.
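The abstract describes the approach only at a high level. The following minimal sketch (in JAX) illustrates the general idea under stated assumptions: a two-parameter, RK2-type one-step scheme for the model ODE u' = -u whose parameters theta and alpha are trained offline by gradient descent against a reference solution on coarse time steps. The scheme form, the model problem, the loss, and all names below are illustrative choices for this sketch, not taken from the report.

import jax
import jax.numpy as jnp

def f(u):
    # Model problem (assumed for this sketch): the scalar linear ODE u' = -u.
    return -u

def step(u, dt, theta, alpha):
    # Generalized RK2-type one-step update with trainable parameters (theta, alpha).
    # It is consistent for any parameter values, since the increment tends to f(u) as dt -> 0.
    k1 = f(u)
    k2 = f(u + alpha * dt * k1)
    return u + dt * ((1.0 - theta) * k1 + theta * k2)

def solve(u0, dt, n_steps, theta, alpha):
    # March the trainable scheme over n_steps coarse time steps.
    def body(u, _):
        return step(u, dt, theta, alpha), None
    u_final, _ = jax.lax.scan(body, u0, None, length=n_steps)
    return u_final

def reference(u0, T):
    # Training target: here simply the exact solution u(T) = u0 * exp(-T);
    # in general this would be an accurate fine-grid reference computation.
    return u0 * jnp.exp(-T)

def loss(params, u0_batch, T, n_coarse):
    # Mean-squared mismatch between the coarse trainable scheme and the
    # reference solution over a batch of training initial data.
    theta, alpha = params[0], params[1]
    dt = T / n_coarse
    pred = jax.vmap(lambda u0: solve(u0, dt, n_coarse, theta, alpha))(u0_batch)
    return jnp.mean((pred - reference(u0_batch, T)) ** 2)

# Offline training: plain gradient descent on the two scheme parameters.
u0_batch = jnp.linspace(0.5, 2.0, 16)   # training initial conditions (assumed)
params = jnp.array([0.5, 1.0])          # start from Heun's method
grad_loss = jax.jit(jax.grad(loss), static_argnums=3)
for _ in range(200):
    params = params - 0.5 * grad_loss(params, u0_batch, 1.0, 4)

print("trained (theta, alpha):", params)

By construction the update remains a consistent discretization of u' = f(u) for any trained parameter values, mirroring the consistency property stated in the abstract; the training only tunes the scheme to the coarse step size and the class of training data.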

Keywords: Numerical methods, Differential equations, Machine learning, Deep neural networks

BibTeX
@Techreport{M18_782,
  author = {S. Mishra},
  title = {A machine learning framework for data driven acceleration of computations of differential equations},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2018-28},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2018/2018-28.pdf},
  year = {2018}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please reference the published version where possible (see SAM Publications).