Research reports
Deep Solution Operators for Variational Inequalities via Proximal Neural Networks
by Ch. Schwab and A. Stein
(Report number 2021-37)
Abstract
We introduce ProxNet, a collection of deep neural networks with ReLU activation
which emulate numerical solution operators of variational inequalities (VIs).
We analyze the expression rates of ProxNets in emulating solution operators for variational inequality problems posed on closed, convex cones in real, separable Hilbert spaces, covering the classical contact problems
in mechanics and early-exercise problems as they arise, e.g., in the valuation of American-style
contracts in Black-Scholes financial market models.
In the finite-dimensional setting, the VIs reduce to matrix VIs in Euclidean space,
and ProxNets emulate classical projected matrix iterations,
such as projected Jacobi and projected SOR methods.
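The last statement of the abstract has a concrete algorithmic counterpart. The following is a minimal sketch (not taken from the report; the matrix, right-hand side, and parameters are illustrative) of a projected Jacobi iteration for a matrix VI posed on the nonnegative cone, written so that each sweep is a single affine map followed by a ReLU, i.e., the layer structure that a ProxNet layer emulates.

# Minimal sketch (illustrative, not the authors' implementation):
# projected Jacobi iteration for the matrix VI / linear complementarity problem
#   find x >= 0 with  Ax - b >= 0  and  x^T (Ax - b) = 0,
# where each iteration is exactly one ReLU layer x -> relu(W x + c).
import numpy as np

def projected_jacobi(A, b, omega=1.0, tol=1e-10, max_iter=10_000):
    """Solve the LCP with SPD matrix A (positive diagonal) and vector b."""
    D = np.diag(A)                                  # diagonal part of A
    W = np.eye(len(b)) - omega * A / D[:, None]     # affine weight of one layer
    c = omega * b / D                               # affine bias of one layer
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = np.maximum(W @ x + c, 0.0)          # ReLU = projection onto {x >= 0}
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: obstacle-type problem with an SPD tridiagonal matrix.
n = 5
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.array([1.0, -0.5, 0.3, -0.2, 0.7])
print(projected_jacobi(A, b))

Projected SOR has the same affine-plus-projection structure, except that the components are updated sequentially with a relaxation parameter.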
Keywords: Deep Neural Networks, Deep Solution Operators, Variational Inequalities, Proximity Operators, ReLU
BibTeX
@Techreport{SS21_979,
  author      = {Ch. Schwab and A. Stein},
  title       = {Deep Solution Operators for Variational Inequalities via Proximal Neural Networks},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number      = {2021-37},
  address     = {Switzerland},
  url         = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2021/2021-37.pdf},
  year        = {2021}
}
Disclaimer
© Copyright for documents on this server remains with the authors.
Copies of these documents made by electronic or mechanical means, including
information storage and retrieval systems, may only be employed for
personal use. The administrators respectfully request that authors
inform them when any paper is published to avoid copyright infringement.
Note that unauthorised copying of copyright material is illegal and may
lead to prosecution. Neither the administrators nor the Seminar for
Applied Mathematics (SAM) accept any liability in this respect.
The most recent version of a SAM report may differ in formatting and style
from the published journal version. Please reference the published version if
possible (see SAM Publications).