Research reports

On the approximation of functions by tanh neural networks

by T. De Ryck and S. Lanthaler and S. Mishra

(Report number 2021-14)

Abstract
We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates on the approximation error with respect to the size of the neural networks. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.
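For readers who want a concrete feel for the abstract's setting, the following is a minimal, self-contained sketch (not the paper's explicit construction, which is analytic rather than training-based): it fits a two-hidden-layer tanh network to a smooth target by plain gradient descent and reports the maximum error on a test grid. The target sin(pi*x), the width, the learning rate and the step count are illustrative assumptions only.

```python
# Minimal sketch: two-hidden-layer tanh network approximating a smooth function.
# All hyperparameters are illustrative; this is not the report's construction.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 256).reshape(-1, 1)        # training grid on [-1, 1]
y = np.sin(np.pi * x)                                  # smooth (analytic) target

width, lr, steps = 20, 1e-2, 20000
W1 = rng.normal(0, 1.0, (1, width));                       b1 = np.zeros(width)
W2 = rng.normal(0, 1.0 / np.sqrt(width), (width, width));  b2 = np.zeros(width)
W3 = rng.normal(0, 1.0 / np.sqrt(width), (width, 1));      b3 = np.zeros(1)

for _ in range(steps):
    # forward pass through the two tanh hidden layers
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    pred = h2 @ W3 + b3
    err = pred - y                                     # residual of the MSE loss

    # backward pass (gradient descent on the mean-squared error)
    g_pred = 2.0 * err / len(x)
    gW3 = h2.T @ g_pred;            gb3 = g_pred.sum(0)
    g_h2 = (g_pred @ W3.T) * (1 - h2 ** 2)
    gW2 = h1.T @ g_h2;              gb2 = g_h2.sum(0)
    g_h1 = (g_h2 @ W2.T) * (1 - h1 ** 2)
    gW1 = x.T @ g_h1;               gb1 = g_h1.sum(0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    W3 -= lr * gW3; b3 -= lr * gb3

# sup-norm error on a finer grid (a crude empirical proxy for an L^infinity bound)
xt = np.linspace(-1.0, 1.0, 2048).reshape(-1, 1)
approx = np.tanh(np.tanh(xt @ W1 + b1) @ W2 + b2) @ W3 + b3
print("max |f - f_NN| on test grid:", np.abs(approx - np.sin(np.pi * xt)).max())
```

The report itself derives explicit, trained-weight-free error bounds in Sobolev norms as a function of network size; the snippet above merely illustrates the architecture in question (two tanh hidden layers) on a single analytic target.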

Keywords: deep learning, neural networks, tanh, function approximation

BibTeX
@Techreport{DLM21_956,
  author = {T. De Ryck and S. Lanthaler and S. Mishra},
  title = {On the approximation of functions by tanh neural networks},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2021-14},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2021/2021-14.pdf},
  year = {2021}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, in order to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please cite the published version where possible (see SAM Publications).
