Master's Degree Project: On the Modeling of Stochastic Gradient Descent with Stochastic Differential Equations
- Date: –15:15
- Location: Ångströmlaboratoriet, Lägerhyddsvägen 1 Å64119
- Lecturer: Martin Leino
- Organiser: Matematiska institutionen
- Contact person: Benny Avelin
Welcome to Martin Leino's presentation of his master's degree project, titled "On the Modeling of Stochastic Gradient Descent with Stochastic Differential Equations".
Stochastic gradient descent (SGD) is arguably the most important algorithm used in optimization problems for large-scale machine learning. Its behaviour has been studied extensively from the viewpoint of mathematical analysis and probability theory; it is widely held that, in the limit where the learning rate of the algorithm tends to zero, a specific stochastic differential equation becomes an adequate model of the algorithm's dynamics. This study presents some of the research in this field by analyzing the application of a recently proven theorem to the problem of tensor principal component analysis. The results, originally obtained in a 2022 article by Gérard Ben Arous, Reza Gheissari and Aukosh Jagannath, illustrate how the phase diagram of functions of SGD in the high-dimensional regime differs from that of the classical fixed-dimensional setting.
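For readers unfamiliar with the diffusion approximation alluded to above, a standard formulation from the literature (not taken from the thesis itself, whose precise setup may differ) is the following: the SGD iterates with learning rate \(\eta\), where \(\gamma_k\) indexes the random sample or mini-batch at step \(k\), are modeled for small \(\eta\) by a stochastic differential equation driven by Brownian motion.

```latex
% SGD update on objective f with learning rate \eta:
x_{k+1} = x_k - \eta\,\nabla f_{\gamma_k}(x_k)

% As \eta \to 0, the iterates are commonly modeled by the SDE
% with \Sigma(x) the covariance of the gradient noise and W_t
% a standard Brownian motion:
\mathrm{d}X_t = -\nabla f(X_t)\,\mathrm{d}t
  + \sqrt{\eta}\,\Sigma(X_t)^{1/2}\,\mathrm{d}W_t
```

The drift term reproduces plain gradient flow, while the \(\sqrt{\eta}\)-scaled diffusion term captures the fluctuations introduced by the random sampling; which of the two dominates is precisely what changes in the high-dimensional regime studied in the thesis.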