Speaker: Prof. Lili Ju (鞠立力), University of South Carolina
Time: June 20, 2023, 10:00
Venue: Room 301, Building 3
Host: School of Mathematics and Physics
Speaker bio: Prof. Lili Ju received his B.S. in Mathematics from Wuhan University in 1995, his M.S. in Computational Mathematics from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, in 1998, and his Ph.D. in Applied Mathematics from Iowa State University in 2002. From 2002 to 2004 he was a postdoctoral researcher at the Institute for Mathematics and its Applications, University of Minnesota. He then joined the University of South Carolina, where he has served as Assistant Professor (2004-2008), Associate Professor (2008-2012), and Professor (2013-present) in the Department of Mathematics. His research focuses on numerical methods and analysis for partial differential equations, nonlocal models and algorithms, computer vision, deep learning algorithms, high-performance scientific computing, and their applications in materials science and geosciences. He has published more than 140 research papers, with over 5,000 Google Scholar citations. Since 2006 he has been the principal investigator of more than ten research projects funded by the U.S. National Science Foundation and the Department of Energy. He served as an associate editor of SIAM Journal on Numerical Analysis from 2012 to 2017, and is currently an associate editor of JSC, NMPDE, NMTMA, and AAMM, among other journals. His work with collaborators on phase-field simulation of microstructure evolution in alloys on the Sunway TaihuLight supercomputer was a finalist for the 2016 ACM Gordon Bell Prize in high-performance computing applications.
Abstract: Inspired by the Nonlinear Level Set Learning (NLL) method, which uses the reversible residual network (RevNet), we propose a new method of Dimension Reduction via Learning Level Sets (DRiLLS) for function approximation. Our method consists of two major components: a pseudo-reversible neural network (PRNN) module that effectively transforms high-dimensional input variables into low-dimensional active variables, and a synthesized regression module that approximates function values based on the transformed data in the low-dimensional space. The PRNN not only relaxes the invertibility constraint imposed in the NLL method by its use of RevNet, but also adaptively weights the influence of each sample and controls the sensitivity of the function to the learned active variables. The synthesized regression uses Euclidean distance in the input space to select neighboring samples, whose projections onto the space of active variables are then used for local least-squares polynomial fitting. This helps resolve the numerical oscillation issues present in traditional local and global regressions. Extensive experimental results demonstrate that our DRiLLS method outperforms both the NLL and Active Subspace methods, especially when the target function possesses critical points in the interior of its input domain.
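To make the synthesized-regression idea concrete, the sketch below is a minimal, hypothetical NumPy illustration (not the authors' implementation): neighbors of a query point are selected by Euclidean distance in the original input space, and a local least-squares polynomial is then fit on their projections onto the active variables. The `transform` argument stands in for a trained PRNN encoder; here an idealized linear encoder with a single active variable is assumed for brevity.

```python
import numpy as np

def synthesized_regression(X, y, transform, x_query, k=20, degree=2):
    """Predict f(x_query) from samples (X, y) -- illustrative sketch only.

    X         : (n, d) high-dimensional input samples
    y         : (n,) function values at the samples
    transform : maps (m, d) inputs to (m, 1) active variables
                (stands in for the trained PRNN encoder)
    """
    # 1. Select the k nearest neighbors in the ORIGINAL input space.
    dists = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(dists)[:k]

    # 2. Project the neighbors and the query onto the active variable.
    z = transform(X[idx])              # (k, 1)
    z_q = transform(x_query[None, :])  # (1, 1)

    # 3. Local least-squares polynomial fit in the reduced space.
    def features(Z):
        # Vandermonde-style design matrix [1, z, z^2, ...] in the
        # single active variable (kept 1-D here for simplicity).
        return np.stack([Z[:, 0] ** p for p in range(degree + 1)], axis=1)

    coef, *_ = np.linalg.lstsq(features(z), y[idx], rcond=None)
    return (features(z_q) @ coef)[0]

# Toy check: f(x) = (w . x)^2 depends on one active variable z = w . x,
# so a degree-2 local fit in z should recover it essentially exactly.
rng = np.random.default_rng(0)
w = np.array([1.0, -2.0, 0.5])
X = rng.uniform(-1, 1, size=(500, 3))
y = (X @ w) ** 2
encoder = lambda A: (A @ w)[:, None]   # idealized "learned" transform
x_star = np.array([0.2, 0.1, -0.3])
pred = synthesized_regression(X, y, encoder, x_star)
print(pred)  # close to (w . x_star)^2 = 0.0225
```

Because the neighborhood is chosen in the input space rather than the reduced space, samples that collapse onto nearby active-variable values but lie far apart in the original domain are kept separate, which is what mitigates the oscillation issues the abstract mentions.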