Research reports
Deep Operator Network Approximation Rates for Lipschitz Operators
by Ch. Schwab, A. Stein and J. Zech
(Report number 2023-30)
Abstract
We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps \(\mathcal G:\mathcal X\to\mathcal Y\) between (subsets of) separable Hilbert spaces \(\mathcal X\), \(\mathcal Y\). The DON architecture considered uses linear encoders \(\mathcal E\) and decoders \(\mathcal D\) via (biorthogonal) Riesz bases of \(\mathcal X\), \(\mathcal Y\), together with an approximator network emulating an infinite-dimensional, parametric coordinate map that is Lipschitz continuous on the sequence space \(\ell^2(\mathbb N)\).
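Schematically (a paraphrase for orientation, not quoted from the report): if \((\psi_j)_{j\in\mathbb N}\) is a Riesz basis of \(\mathcal X\) with biorthogonal system \((\widetilde\psi_j)_{j\in\mathbb N}\), and \((\eta_k)_{k\in\mathbb N}\) is a Riesz basis of \(\mathcal Y\), the encoder, decoder and DON surrogate take the form
\[
\mathcal E(x)=\bigl(\langle x,\widetilde\psi_j\rangle\bigr)_{j\in\mathbb N}\in\ell^2(\mathbb N),\qquad
\mathcal D(\boldsymbol y)=\sum_{k\in\mathbb N} y_k\,\eta_k\in\mathcal Y,\qquad
\mathcal G\approx\mathcal D\circ\mathcal A\circ\mathcal E,
\]
where \(\mathcal A:\ell^2(\mathbb N)\to\ell^2(\mathbb N)\) denotes the Lipschitz continuous coordinate map emulated by the approximator network.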
Unlike previous works ([Herrmann, Schwab and Zech: Neural and Spectral operator surrogates: construction and expression rate bounds, SAM Report, 2022], [Marcati and Schwab: Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations, SAM Report, 2022]), which required, for example, \(\mathcal G\) to be holomorphic, the present expression rate results require only Lipschitz (or Hölder) continuity of \(\mathcal G\).
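Hölder continuity of \(\mathcal G\) on a set \(K\subset\mathcal X\) is understood in the usual sense, with constants denoted here by \(L>0\) and exponent \(\tau\in(0,1]\) (the case \(\tau=1\) being the Lipschitz case):
\[
\|\mathcal G(x)-\mathcal G(x')\|_{\mathcal Y}\;\le\;L\,\|x-x'\|_{\mathcal X}^{\tau}
\qquad\text{for all }x,x'\in K.
\]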
Key to the proof of the present expression rate bounds is the use either of super-expressive activations (e.g. [Yarotsky: Elementary superexpressive activations, Int. Conf. on ML, 2021], [Shen, Yang and Zhang: Neural network approximation: Three hidden layers are enough, Neural Networks, 2021], and the references therein), which are inspired by the Kolmogorov superposition theorem, or of nonstandard NN architectures with standard (ReLU) activations, as recently proposed in [Zhang, Shen and Yang: Neural Network Architecture Beyond Width and Depth, Adv. in Neural Inf. Proc. Sys., 2022]. We illustrate the abstract results by approximation rate bounds for the emulation of a) solution operators for parametric elliptic variational inequalities, and b) Lipschitz maps of Hilbert-Schmidt operators.
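For context (paraphrasing the cited works rather than the present report), an activation family is called super-expressive if a single, fixed network architecture using these activations suffices to reach any target accuracy by adjusting the weights alone: for every compact \(K\subset\mathbb R^d\), every \(f\in C(K)\) and every \(\varepsilon>0\) there exists a weight vector \(\theta\) such that
\[
\sup_{x\in K}\bigl|f(x)-\Phi_\theta(x)\bigr|\;\le\;\varepsilon,
\]
where \(\Phi_\theta\) denotes the realization of the fixed architecture with weights \(\theta\); no growth of the width or depth is required.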
Keywords: Neural Networks, Operator Learning, Curse of Dimensionality, Lipschitz Continuous Operators
BibTeX
@Techreport{SSZ23_1067,
  author      = {Ch. Schwab and A. Stein and J. Zech},
  title       = {Deep Operator Network Approximation Rates for Lipschitz Operators},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number      = {2023-30},
  address     = {Switzerland},
  url         = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2023/2023-30.pdf},
  year        = {2023}
}
Disclaimer
© Copyright for documents on this server remains with the authors.
Copies of these documents made by electronic or mechanical means, including
information storage and retrieval systems, may only be employed for
personal use. The administrators respectfully request that authors
inform them when any paper is published to avoid copyright infringement.
Note that unauthorised copying of copyright material is illegal and may
lead to prosecution. Neither the administrators nor the Seminar for
Applied Mathematics (SAM) accept any liability in this respect.
The most recent version of a SAM report may differ in formatting and style
from the published journal version. Please reference the published version
whenever possible (see SAM Publications).