Mathematics Seminar No. 2622: Uniform Optimality in Convex and Nonconvex Optimization

Created: 2024/01/03, by 龚惠英

Title: Uniform Optimality in Convex and Nonconvex Optimization

Speaker: Prof. Guanghui Lan (Georgia Institute of Technology, USA)

Time: Friday, January 5, 2024, 15:00

Place: Room F309, Main Campus

Inviter: Prof. Zi Xu

Host: Department of Mathematics

Abstract: The past few years have witnessed growing interest in the development of easily implementable parameter-free first-order methods to facilitate their applications, e.g., in data science and machine learning. In this talk, I will discuss some recent progress we have made on uniformly optimal methods for convex and nonconvex optimization. By uniform optimality, we mean that these algorithms do not require the input of any problem parameters but can still achieve the best possible iteration complexity bounds for solving different classes of optimization problems. We first consider convex optimization problems under different smoothness levels and show that neither such smoothness information nor line search procedures are needed to achieve uniform optimality. We then consider regularity conditions (e.g., strong convexity and lower curvature) that are imposed over a global scope and are thus notoriously more difficult to estimate. By presenting novel methods that achieve tight complexity bounds for computing solutions with verifiably small (projected) gradients, we show that such regularity information is in fact superfluous for handling strongly convex and nonconvex problems. It is worth noting that our complexity bound for nonconvex problems also appears to be new in the literature.
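To give a flavor of what "parameter-free" means here, the sketch below runs gradient descent with a step size estimated on the fly from consecutive iterates, so the caller supplies no Lipschitz constant and no line search is performed. This is a generic illustration of the idea (an adaptive-step-size rule in the style of recent parameter-free literature), not the speaker's algorithm; the function names and constants are ours.

```python
import numpy as np

def adaptive_gd(grad, x0, lam0=1e-6, iters=2000):
    """Gradient descent with an on-the-fly step size: a generic
    illustration of the parameter-free idea, NOT the speaker's method.
    The caller provides no smoothness parameter and no line search."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    lam, theta = lam0, float("inf")  # tiny trial step to start
    x = x_prev - lam * g_prev
    for _ in range(iters):
        g = grad(x)
        dx, dg = x - x_prev, g - g_prev
        nrm = np.linalg.norm(dg)
        if nrm > 0:
            # New step: capped both by controlled growth over the previous
            # step and by a local curvature estimate ||dx|| / (2 ||dg||).
            lam_new = min(np.sqrt(1.0 + theta) * lam,
                          np.linalg.norm(dx) / (2.0 * nrm))
            theta, lam = lam_new / lam, lam_new
        x_prev, g_prev = x, g
        x = x - lam * g
    return x

# Example: minimize f(x) = 0.5 * x^T A x without telling the method
# anything about A's smoothness or strong convexity.
A = np.diag([1.0, 10.0])
sol = adaptive_gd(lambda x: A @ x, x0=[1.0, 1.0])
```

The step size here adapts to the curvature the method actually encounters along its trajectory, which is the practical appeal of parameter-free schemes: no tuning against worst-case global constants.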


