Research reports

Convergence rates for the stochastic gradient descent method for non-convex objective functions

by B. Fehrman, B. Gess and A. Jentzen

(Report number 2019-21)

Abstract
We prove local convergence to minima and estimates on the rate of convergence for the stochastic gradient descent method for objective functions that are not necessarily globally convex or contracting. In particular, the results apply to simple objective functions arising in machine learning.
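For illustration only, below is a minimal sketch of plain stochastic gradient descent on a one-dimensional non-convex objective. The objective f(x) = x^4 - 3x^2 + x, the step-size schedule c/n, and the additive Gaussian gradient noise are hypothetical choices made for this example; they are not taken from the report.

import random

# Hypothetical non-convex objective f(x) = x^4 - 3x^2 + x.
# It has two local minima, so the iterates can converge to
# either one depending on the starting point.
def grad_f(x):
    return 4 * x**3 - 6 * x + 1

def sgd(x0, steps=10_000, c=0.1, noise=0.5, seed=0):
    """Plain SGD with a decreasing step size c/n and noisy gradients."""
    rng = random.Random(seed)
    x = x0
    for n in range(1, steps + 1):
        # Noisy gradient estimate: true gradient plus zero-mean noise,
        # standing in for a stochastic gradient oracle.
        g = grad_f(x) + noise * rng.gauss(0.0, 1.0)
        x -= (c / n) * g
    return x

if __name__ == "__main__":
    for x0 in (-2.0, 2.0):
        print(f"start {x0:+.1f} -> approx. minimizer {sgd(x0):+.4f}")

Running this shows local convergence in the sense the abstract describes: starts near either basin settle at the corresponding local minimizer rather than at a single global one.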
BibTeX
@Techreport{FGJ19_825,
  author = {B. Fehrman and B. Gess and A. Jentzen},
  title = {Convergence rates for the stochastic gradient descent method for non-convex objective functions},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2019-21},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2019/2019-21.pdf},
  year = {2019}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Do reference the published version if possible (see SAM Publications).