Modern Theory of Gradient-Based Optimization

Time: 2026-04-24, 13:30

Venue: Lecture Hall 2432, Academic Building No. 2, Songjiang Campus

Speaker: Shi Bin

Speaker biography:

Shi Bin is a tenured associate professor at Fudan University and the Shanghai Institute for Mathematics and Interdisciplinary Sciences. He previously served as an associate professor at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, and as a postdoctoral researcher at the University of California, Berkeley. His research interests include optimization for machine learning, numerical analysis, and scientific computing.

Abstract:

In recent years, the close connection between discrete optimization algorithms and continuous differential equations has been increasingly recognized, and studying discrete optimization methods through the lens of differential equations has become a vibrant research topic. In this talk, I will introduce a systematically developed Lyapunov analysis framework that serves as a bridge between continuous dynamics and discrete algorithms. The framework integrates tools from physics, such as dimensional analysis, with techniques from asymptotic analysis of differential equations to construct continuous-time models for gradient-based optimization algorithms, including accelerated gradient methods, ADMM, and PDHG. This approach has clarified why and how gradient descent can be accelerated, how acceleration extends naturally to proximal settings, and what fundamentally distinguishes gradient descent from ADMM. I will present these recent theoretical developments.
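As a minimal illustration of the ODE perspective the abstract describes (not taken from the talk itself), the sketch below treats plain gradient descent as the forward-Euler discretization of the gradient flow x'(t) = -∇f(x(t)), and Nesterov's accelerated method as a discretization whose continuous-time limit is the second-order ODE x'' + (3/t)x' + ∇f(x) = 0 studied by Su, Boyd, and Candès. The quadratic objective and step size are hypothetical choices for demonstration only.

```python
import numpy as np

# Hypothetical toy objective: f(x) = 0.5 * x^T A x with an
# ill-conditioned diagonal A (largest eigenvalue L = 100).
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

def gradient_descent(x0, step, iters):
    """Forward-Euler discretization of the gradient flow x'(t) = -grad f(x(t))."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def nesterov(x0, step, iters):
    """Nesterov's accelerated gradient method; its continuous-time limit
    is the second-order ODE x'' + (3/t) x' + grad f(x) = 0."""
    x = np.array(x0, dtype=float)
    y = x.copy()
    for k in range(1, iters + 1):
        x_new = y - step * grad(y)          # gradient step at the extrapolated point
        y = x_new + (k - 1) / (k + 2) * (x_new - x)  # momentum extrapolation
        x = x_new
    return x

x0 = [1.0, 1.0]
step = 1.0 / 100.0  # 1/L, the standard stable step size
print(f(gradient_descent(x0, step, 300)), f(nesterov(x0, step, 300)))
```

Both iterations drive f toward its minimum at the origin; the Lyapunov framework in the talk makes such convergence claims rigorous by exhibiting an energy function that decreases along both the continuous trajectory and its discretization.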

Host: Zha Dongbing