Department of Statistics Seminar Series, No. 325

 

Time: November 8, 2018 (Thursday), 15:30–16:30

Venue: Room 104, Li Dasan Building

Host: Professor Zhang Xinsheng, Department of Statistics, School of Management, Fudan University

Title: Computational and Statistical Analysis of Stochastic Gradient Descent with Implication for Deep Learning

Speaker: Professor Yazhen Wang (王亚珍), Department of Statistics, University of Wisconsin-Madison

Bio: Yazhen Wang is Professor and Chair of the Department of Statistics at the University of Wisconsin-Madison. He received his Ph.D. from the University of California, Berkeley in 1992. His research focuses on financial statistics and time series. He has published more than 100 papers in international journals, including over twenty in top scientific journals such as AOS, JASA, JOE, and JBES. He is a Fellow of the Institute of Mathematical Statistics (IMS) and a Fellow of the American Statistical Association (ASA). He serves as Editor-in-Chief of Stat. Interface and as Associate Editor of journals including JASA, Ann. Statist., Ann. Appl. Stat., J. Bus. Econom. Statist., Statist. Sinica, and Econom. J.

Abstract: Gradient descent algorithms, such as accelerated gradient descent and stochastic gradient descent, are widely employed to solve optimization problems in statistics and machine learning. This talk will present a new asymptotic analysis of these algorithms via continuous-time ordinary or stochastic differential equations. I will illustrate that this analysis provides a novel unified framework for jointly studying the computational behavior of these algorithms as the number of iterations grows and the large-sample statistical behavior of the optimization solutions they compute (i.e., statistical decision rules such as estimators and classifiers). I will also discuss the implications of these results for deep learning.
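
As background for the continuous-time viewpoint mentioned in the abstract, a minimal sketch of the standard limits in this literature (the notation f, η, g, ξ, Σ, and W below is ours, and the speaker's exact formulation may differ): with step size η, gradient descent on an objective f is approximated, as η → 0, by the gradient-flow ODE, while stochastic gradient descent with noisy gradients of covariance Σ(x) is approximated by a diffusion-type SDE:

\[
x_{k+1} = x_k - \eta\,\nabla f(x_k) \;\longrightarrow\; \dot X(t) = -\nabla f\bigl(X(t)\bigr),
\]
\[
x_{k+1} = x_k - \eta\, g(x_k;\xi_k) \;\longrightarrow\; dX_t = -\nabla f(X_t)\,dt + \sqrt{\eta}\,\Sigma(X_t)^{1/2}\,dW_t.
\]

Analyzing the limiting ODE or SDE then yields statements about the dynamic behavior of the discrete iterates.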

 

 

                                                           Department of Statistics

November 7, 2018