Heavy-Tail Phenomenon in Machine Learning

Speaker: Zhu Lingjiong, Associate Professor, Florida State University

Host: Liu Yanchu, Professor, Lingnan College

Time and Date: 09:30, Dec. 21, 2023

Venue: Huang Bingli Conference Room (203), Lingnan Hall

Language: English + Chinese

Abstract:

In recent years, various notions of capacity and complexity have been proposed for characterizing the generalization properties of stochastic gradient descent (SGD) in deep learning. Some of the popular notions that correlate well with performance on unseen data are (i) the flatness of the local minimum found by SGD, which is related to the eigenvalues of the Hessian, (ii) the ratio of the stepsize to the batch size, which essentially controls the magnitude of the stochastic gradient noise, and (iii) the tail-index, which measures the heaviness of the tails of the distribution of the network weights at convergence. In this talk, we argue that these three seemingly unrelated perspectives on generalization are deeply linked, and we show that heavy tails can help SGD generalize better. We claim that, depending on the structure of the Hessian of the loss at the minimum and the choice of algorithm parameters, the distribution of the SGD iterates will converge to a heavy-tailed stationary distribution. We further characterize the behavior of the tails with respect to the algorithm parameters, the dimension, and the curvature, and we translate these results into insights about the behavior of SGD in deep learning. We support our theory with experiments on synthetic data and on fully connected and convolutional neural networks.
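
To illustrate the mechanism behind these claims, consider the simplest setting in which heavy tails emerge: SGD on a one-dimensional least-squares problem with Gaussian data. Each iterate then satisfies a random linear recursion x <- M*x + q, where the multiplicative noise M is determined by the stepsize and the batch, and by Kesten-Goldie theory the stationary distribution of such a recursion has a power-law tail whose index alpha solves E[|M|^alpha] = 1. The Python sketch below (a toy illustration under these assumptions, not the speaker's experimental code) simulates this recursion and estimates the tail index with a Hill estimator:

import numpy as np

# Toy model (an illustrative assumption, not the speaker's setup):
# one-dimensional least squares with loss f(x) = E[(a*x - y)^2]/2 and
# data a, y ~ N(0, 1). A minibatch SGD step
#     x <- x - eta * mean_i[a_i * (a_i * x - y_i)]
#        = (1 - eta * mean_i[a_i^2]) * x + eta * mean_i[a_i * y_i]
# is a random linear recursion x <- M*x + q; by Kesten-Goldie theory its
# stationary law has a power-law tail with index alpha solving E[|M|^alpha] = 1.

def run_sgd(eta, batch, n_iter=200_000, burn_in=50_000, seed=0):
    """Run minibatch SGD on the toy problem; return |x_k| after burn-in."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_iter - burn_in)
    for k in range(n_iter):
        a = rng.standard_normal(batch)
        y = rng.standard_normal(batch)
        x -= eta * np.mean(a * (a * x - y))  # one stochastic gradient step
        if k >= burn_in:
            samples[k - burn_in] = abs(x)
    return samples

def hill_estimator(samples, k=2000):
    """Hill estimate of the tail index alpha from the k largest samples."""
    top = np.sort(samples)[-k:]  # k largest values; top[0] is the threshold
    return k / np.sum(np.log(top / top[0]))

# Larger stepsize / smaller batch => heavier tails (smaller alpha).
for eta, batch in [(0.1, 10), (0.3, 5), (0.5, 1)]:
    s = run_sgd(eta, batch)
    print(f"eta={eta}, batch={batch}: estimated tail index ~ {hill_estimator(s):.2f}")

In this toy model, increasing the stepsize or decreasing the batch size drives the estimated alpha down, i.e. the tails become heavier, matching the dependence on the stepsize-to-batch-size ratio described in the abstract.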

Biography:

Lingjiong Zhu received his BA from the University of Cambridge in 2008 and his PhD from New York University in 2013. He worked at Morgan Stanley and the University of Minnesota before joining the faculty at Florida State University in 2015, where he is currently an Associate Professor. His research interests include applied probability, data science, financial engineering, and operations research. His work has been published in many leading outlets, including Annals of Applied Probability, Bernoulli, Finance and Stochastics, ICML, INFORMS Journal on Computing, Journal of Machine Learning Research, NeurIPS, Production and Operations Management, SIAM Journal on Financial Mathematics, Stochastic Processes and their Applications, Operations Research, and Review of Economics and Statistics. His research has been supported by three NSF grants and a Simons Collaboration Grant. He received the Kurt O. Friedrichs Prize for an outstanding dissertation from the Courant Institute, New York University, in 2013, the Developing Scholar Award from Florida State University in 2022, the Graduate Faculty Mentor Award from Florida State University in 2023, and the MSOM iFORM SIG Best Paper Award from the MSOM Society in 2023. For more information, please visit his website: https://www.math.fsu.edu/People/profile.math?id=1472