Title: Error estimates of residual minimization using neural networks for linear PDEs
Speaker: Professor Zhongqiang Zhang
Time: December 17, 2020, 10:00–12:30
Venue: Tencent Meeting, ID 846821131
Abstract: We propose an abstract framework for analyzing the convergence of least-squares methods based on residual minimization when the feasible solutions are neural networks. Using norm relations and compactness arguments, we derive error estimates for both continuous and discrete formulations of residual minimization in strong and weak forms. The formulations cover recently developed physics-informed neural networks based on strong and variational formulations. This is joint work with Yeonjong Shin and George Em Karniadakis at Brown University. The full text of our work can be found at https://arxiv.org/abs/2010.08019
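As a rough illustration of the strong-form residual minimization described in the abstract (not the authors' code), the following minimal PyTorch sketch trains a small network to minimize a discrete least-squares PDE residual for the 1D Poisson problem -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0 and f(x) = pi^2 sin(pi x); the network size, optimizer, collocation points, and boundary penalty are illustrative assumptions.

```python
# Minimal sketch of strong-form residual minimization (PINN-style), assuming PyTorch.
import math
import torch

torch.manual_seed(0)

# Small fully connected network approximating the PDE solution u(x).
model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    """Strong-form residual r(x) = -u''(x) - f(x), with f(x) = pi^2 sin(pi x)."""
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = math.pi ** 2 * torch.sin(math.pi * x)
    return -d2u - f

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x_interior = torch.rand(256, 1)            # collocation points in (0, 1)
x_boundary = torch.tensor([[0.0], [1.0]])  # Dirichlet boundary points

for step in range(5000):
    optimizer.zero_grad()
    # Discrete least-squares residual in the interior plus a boundary penalty.
    loss = pde_residual(x_interior).pow(2).mean() \
         + model(x_boundary).pow(2).mean()
    loss.backward()
    optimizer.step()

# Compare with the exact solution u(x) = sin(pi x) on a test grid.
x_test = torch.linspace(0, 1, 101).reshape(-1, 1)
err = (model(x_test) - torch.sin(math.pi * x_test)).abs().max()
print(f"max error: {err.item():.3e}")
```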
Speaker bio: Professor Zhongqiang Zhang is with the Department of Mathematical Sciences at Worcester Polytechnic Institute (USA). He received a Ph.D. in computational mathematics from the Department of Mathematics at Shanghai University in 2011, under the supervision of Professor Heping Ma. In January 2014 he received a Ph.D. from the Division of Applied Mathematics at Brown University, where he was awarded the David Gottlieb Award at graduation. He has been on the faculty of the Department of Mathematical Sciences at Worcester Polytechnic Institute since July 2014. His research interests include numerical methods for integro-differential equations, computational probability and optimization, and computational theory for machine learning. He has published numerous papers in leading computational mathematics journals such as SINUM, NM, SISC, and JCP. He is the first author, together with Professor George Em Karniadakis, of the book Numerical Methods for Stochastic Partial Differential Equations with White Noise.