This seminar will take place online, via Zoom. Please contact the APAM Department for the Zoom link.

**Speaker:** Arnulf Jentzen, University of Münster

**Title:** "Convergence analysis for gradient descent optimization methods in the training of artificial neural networks"

**Abstract:** Gradient descent (GD) type optimization methods are the standard instrument to train artificial neural networks (ANNs) with rectified linear unit (ReLU) activation. Despite the great success of GD type optimization methods in numerical simulations for the training of ANNs with ReLU activation, it remains -- even in the simplest situation of the plain vanilla GD optimization method with random initializations -- an open problem to prove (or disprove) the conjecture that the true risk of the GD optimization method converges to zero in the training of ANNs with ReLU activation as the width/depth of the ANNs, the number of independent random initializations, and the number of GD steps increase to infinity. In this talk we prove this conjecture in the situation where the probability distribution of the input data is equivalent to the continuous uniform distribution on a compact interval, where the probability distributions for the random initializations of the ANN parameters are standard normal distributions, and where the target function under consideration is continuous and piecewise affine linear.
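The setting of the abstract can be sketched numerically. The following is a minimal, illustrative NumPy sketch — not the construction from the talk — of plain vanilla GD with several independent standard normal random initializations, applied to a shallow ReLU ANN with inputs drawn uniformly from a compact interval. The specific target function `|x - 0.5|` (continuous and piecewise affine), the network width, the learning rate, and the step count are all my own illustrative choices.

```python
import numpy as np

def target(x):
    # Illustrative continuous, piecewise affine target on [0, 1] (my choice).
    return np.abs(x - 0.5)

def init(rng, width):
    # Standard normal random initialization of all ANN parameters.
    return [rng.standard_normal(width),    # inner weights w
            rng.standard_normal(width),    # inner biases b
            rng.standard_normal(width),    # outer weights v
            float(rng.standard_normal())]  # outer bias c

def predict(p, x):
    w, b, v, c = p
    # Shallow ReLU ANN: x -> sum_k v_k * relu(w_k * x + b_k) + c
    return np.maximum(np.outer(x, w) + b, 0.0) @ v + c

def risk(p, x, y):
    # Empirical L2 risk (mean squared error) on the sample.
    return float(np.mean((predict(p, x) - y) ** 2))

def gd_train(p, x, y, lr=0.01, steps=2000):
    # Plain vanilla (full-batch, constant step size) gradient descent.
    w, b, v, c = p
    n = x.size
    for _ in range(steps):
        pre = np.outer(x, w) + b            # (n, width) pre-activations
        act = np.maximum(pre, 0.0)          # ReLU activations
        g = 2.0 * (act @ v + c - y) / n     # dRisk/dprediction, shape (n,)
        mask = (pre > 0.0).astype(float)    # ReLU derivative
        gw = ((mask * g[:, None]) * x[:, None]).sum(axis=0) * v
        gb = (mask * g[:, None]).sum(axis=0) * v
        gv = act.T @ g
        gc = g.sum()
        w, b, v, c = w - lr * gw, b - lr * gb, v - lr * gv, c - lr * gc
    return [w, b, v, c]

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 256)  # inputs ~ uniform on a compact interval
y = target(x)

# Several independent random initializations; keep the best trained run.
runs = [gd_train(init(rng, 16), x, y) for _ in range(5)]
best = min(runs, key=lambda p: risk(p, x, y))
```

In this toy setting the empirical risk of the best run typically decreases by orders of magnitude, which is the behavior whose limit (risk converging to zero as width, number of initializations, and GD steps grow) the talk addresses rigorously.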

**Biography:** Arnulf Jentzen (born November 1983) is a full professor at the University of Münster (since 2019) and a presidential chair professor at the Chinese University of Hong Kong, Shenzhen (since 2021). He began his undergraduate studies in mathematics (minor field of study: computer science) at Goethe University Frankfurt in Germany in 2004, received his diploma degree there in 2007, and completed his PhD in mathematics there in 2009. The core research topics of his group are machine learning approximation algorithms, computational stochastics, numerical analysis for high-dimensional partial differential equations (PDEs), stochastic analysis, and computational finance. He is particularly interested in deep learning based algorithms for high-dimensional approximation problems and various kinds of differential equations. He currently serves as an associate or division editor for the *Annals of Applied Probability* (AAP, since 2019), *Communications in Computational Physics* (CiCP, since 2021), *Communications in Mathematical Sciences* (CMS, since 2015), *Discrete and Continuous Dynamical Systems Series B* (DCDS-B, since 2018), the *Journal of Complexity* (JoC, since 2016), the *Journal of Mathematical Analysis and Applications* (JMAA, since 2014), the *SIAM/ASA Journal on Uncertainty Quantification* (JUQ, since 2020), the *SIAM Journal on Scientific Computing* (SISC, since 2020), the *SIAM Journal on Numerical Analysis* (SINUM, since 2016), the Springer Nature journal *Partial Differential Equations and Applications* (PDEA, since 2019), and the *Journal of Applied Mathematics and Physics* (ZAMP, since 2016). In 2020 he was awarded the Felix Klein Prize of the European Mathematical Society (EMS). Further details on the activities of his research group can be found at http://www.ajentzen.de.