On stochastic optimization and the Adam optimizer: Divergence, convergence rates, and acceleration techniques