Title: Projected Subgradient Methods in Infinite Dimensional Spaces
Speaker: Professor Hong-Kun Xu, School of Science, Hangzhou Dianzi University
Time: 16:00-17:00, November 17, 2020
Venue: Lecture Hall 145, bat365官网登录
Abstract: Subgradient methods, introduced by Shor and developed by Alber, Iusem, Nesterov, Polyak, Solodov, and many others, are used to solve nondifferentiable optimization problems. The major difference from gradient descent methods (or projection-gradient methods) for differentiable optimization problems lies in how the step-sizes are selected. For instance, constant step-sizes, which work for differentiable objective functions, no longer work for nondifferentiable objective functions; in the latter case, diminishing step-sizes must be adopted instead.
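For context (an illustrative sketch of the standard scheme, not taken from the talk itself): the projected subgradient method for minimizing a convex function $f$ over a closed convex set $C$ in a Hilbert space, with metric projection $P_C$, takes the form
\[
  x_{n+1} = P_C\bigl(x_n - \alpha_n g_n\bigr), \qquad g_n \in \partial f(x_n),
\]
where a typical diminishing step-size rule requires $\alpha_n > 0$, $\sum_n \alpha_n = \infty$, and $\sum_n \alpha_n^2 < \infty$ (for example, $\alpha_n = c/(n+1)$ for some constant $c > 0$).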
In this talk, we will first review some existing projected subgradient methods; the main purpose is to discuss weak and strong convergence of projected subgradient methods in an infinite-dimensional Hilbert space. In particular, a regularization technique that ensures strong convergence of projected subgradient methods will be presented. An extension to the proximal-subgradient method for minimizing the sum of two nondifferentiable convex functions will also be discussed.
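Again for context only (a standard formulation, not necessarily the variant discussed in the talk): for the sum of two nondifferentiable convex functions $f + g$, where $g$ admits an easily computable proximal mapping $\operatorname{prox}_{\lambda g}(x) = \arg\min_{y}\{\, g(y) + \tfrac{1}{2\lambda}\|y - x\|^2 \,\}$, a proximal-subgradient iteration reads
\[
  x_{n+1} = \operatorname{prox}_{\alpha_n g}\bigl(x_n - \alpha_n u_n\bigr), \qquad u_n \in \partial f(x_n),
\]
with diminishing step-sizes $\alpha_n$ as above. Halpern- or Tikhonov-type regularization terms are commonly added to such iterations to upgrade weak convergence to strong convergence in infinite-dimensional spaces.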
About the speaker: Hong-Kun Xu received his Ph.D. from Xi'an Jiaotong University in 1988 and is currently a Distinguished Professor at Hangzhou Dianzi University. He has previously served as Lecturer and Associate Professor at East China University of Science and Technology; Visiting Professor at the University of Seville, Spain; Postdoctoral Fellow at Dalhousie University, Canada; Associate Professor, Professor, and Senior Professor at the University of KwaZulu-Natal, South Africa; Xiwan Chair Professor at National Sun Yat-sen University, Taiwan; and Tianjin Distinguished Chair Professor, among other positions. In 2004 he received the South African Mathematical Society Award for Research Distinction and the Second Prize of the Natural Science Award of the Ministry of Education (jointly with Zongben Xu and Yaolin Jiang). He has served as Chair of the Department of Applied Mathematics and Dean of the College of Science at National Sun Yat-sen University, Taiwan. He was elected a Member of the Academy of Science of South Africa in 2005 and a Fellow of The World Academy of Sciences (TWAS) in 2012, was named a Thomson Reuters/Clarivate Highly Cited Researcher from 2014 to 2018, and was named an Elsevier Most Cited Chinese Researcher in 2019. He has published more than 250 papers. He serves, or has served, on the editorial boards of nearly 10 SCI mathematics journals and as Co-Editor-in-Chief of FPTASE, and has delivered more than 20 invited/keynote/plenary talks at international conferences. His main research interests include nonlinear functional analysis, optimization theory and algorithms, geometric theory of Banach spaces, iterative methods for nonlinear mappings, inverse problems and their regularization methods, and mathematical finance.