Time: 1:30–3:00 p.m., October 13
Venue: Room 21-304, Qinyuan Building (勤园)
Abstract:
Subgradient methods, introduced by Shor and developed by Alber, Iusem, Nesterov, Polyak, Solodov, and many others, are used to solve nondifferentiable optimization problems. The major difference from gradient descent methods (or projection-gradient methods) for differentiable optimization problems lies in the manner in which step-sizes are selected. For instance, constant step-sizes, which work for differentiable objective functions, no longer work for nondifferentiable ones; in the latter case, diminishing step-sizes must instead be adopted. In this talk, we will first review some existing projected subgradient methods; the main purpose is to discuss weak and strong convergence of projected subgradient methods in an infinite-dimensional Hilbert space. In particular, a regularization technique that yields strong convergence of projected subgradient methods will be presented. An extension to the proximal-subgradient method for minimizing the sum of two nondifferentiable convex functions will also be discussed.
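For orientation, here is a minimal sketch of the projected subgradient iteration in standard notation; the particular schemes and step-size rules treated in the talk may differ. For minimizing a convex function $f$ over a closed convex set $C$ in a Hilbert space,
\[
x_{k+1} = P_C\bigl(x_k - \alpha_k g_k\bigr), \qquad g_k \in \partial f(x_k),
\]
where $P_C$ is the metric projection onto $C$ and $\partial f(x_k)$ is the subdifferential of $f$ at $x_k$. A typical diminishing step-size rule is
\[
\alpha_k > 0, \qquad \sum_{k} \alpha_k = \infty, \qquad \sum_{k} \alpha_k^2 < \infty.
\]
In the proximal-subgradient extension mentioned above, for minimizing a sum $f + h$ of two nondifferentiable convex functions, the projection is replaced by the proximity operator of $h$:
\[
x_{k+1} = \operatorname{prox}_{\alpha_k h}\bigl(x_k - \alpha_k g_k\bigr), \qquad g_k \in \partial f(x_k).
\]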
About the speaker:
Hong-Kun Xu (徐洪坤) is a doctoral supervisor at Hangzhou Dianzi University (杭州电子科技大学). In 2004 he received the South African Mathematical Society Award for Research Distinction and the Second Prize of the Natural Science Award of the Ministry of Education. He was elected a Fellow of the Academy of Science of South Africa in 2005, and a Fellow of TWAS (originally the Third World Academy of Sciences, now The World Academy of Sciences) in 2012. He was named a Thomson Reuters Highly Cited Researcher in 2014-2016 and a Clarivate Analytics Highly Cited Researcher in 2017. He has published more than 230 papers, serves (or has served) on the editorial boards of more than 20 mathematics journals (10 of them SCI-indexed), and has given more than 50 invited and keynote talks at international conferences. His main research interests include nonlinear functional analysis, optimization theory and algorithms, geometry of Banach spaces, iterative methods for nonlinear mappings, inverse problems and their regularization methods, and mathematical finance.