
Research reports

Deep ReLU networks and high-order finite element methods II: Chebyshev emulation

by J. A. A. Opschoor and Ch. Schwab

(Report number 2023-38)

Abstract
We show expression rates and stability in Sobolev norms of deep feedforward ReLU neural networks (NNs) in terms of the number of parameters defining the NN for continuous, piecewise polynomial functions, on arbitrary, finite partitions \(\mathcal{T}\) of a bounded interval \((a,b)\). Novel constructions of ReLU NN surrogates encoding function approximations in terms of Chebyshev polynomial expansion coefficients are developed which require fewer neurons than previous constructions. Chebyshev coefficients can be computed easily from the values of the function in the Clenshaw–Curtis points using the inverse fast Fourier transform. Bounds on expression rates and stability are obtained that are superior to those of constructions based on ReLU NN emulations of monomials as considered in [Opschoor, Petersen and Schwab, 2020] and [Montanelli, Yang and Du, 2021]. All emulation bounds are explicit in terms of the (arbitrary) partition of the interval, the target emulation accuracy and the polynomial degree in each element of the partition. ReLU NN emulation error estimates are provided for various classes of functions and norms, commonly encountered in numerical analysis. In particular, we show exponential ReLU emulation rate bounds for analytic functions with point singularities and develop an interface between Chebfun approximations and constructive ReLU NN emulations.
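The abstract notes that the Chebyshev coefficients follow from function values at the Clenshaw–Curtis points via a fast Fourier transform. The sketch below is a minimal NumPy illustration of that standard transform, not code from the report; the function name cheb_coeffs and the test function are our own. It applies a forward real FFT to the even extension of the samples, which agrees with the inverse-FFT formulation mentioned in the abstract up to scaling.

import numpy as np

def cheb_coeffs(vals):
    """Coefficients c_0, ..., c_n of the Chebyshev interpolant
    sum_j c_j T_j(x) through the values at the Clenshaw-Curtis
    points x_k = cos(pi*k/n), k = 0, ..., n."""
    n = len(vals) - 1
    if n == 0:  # single sample: constant interpolant
        return np.asarray(vals, dtype=float).copy()
    # Even 2n-periodic extension: with x = cos(theta), the Chebyshev
    # series in x becomes a cosine series in theta.
    ext = np.concatenate([vals, vals[-2:0:-1]])
    # The FFT of a real, even sequence is real; rescale to get c_j.
    c = np.real(np.fft.fft(ext))[: n + 1] / n
    c[0] /= 2.0
    c[-1] /= 2.0
    return c

# Example: the coefficients of exp(x) on [-1, 1] decay faster than
# exponentially, as expected for an entire function.
n = 16
x = np.cos(np.pi * np.arange(n + 1) / n)  # Clenshaw-Curtis points
c = cheb_coeffs(np.exp(x))
print(np.abs(c[:6]))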

Keywords: Neural Networks, hp-Finite Element Methods, Chebyshev Expansions

BibTeX
@Techreport{OS23_1075,
  author = {J. A. A. Opschoor and Ch. Schwab},
  title = {Deep ReLU networks and high-order finite element methods II: Chebyshev emulation},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2023-38},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2023/2023-38.pdf},
  year = {2023}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please reference the published version where possible (see SAM Publications).
