My research focuses on deep learning theory, PDE learning, numerical PDEs and image processing.
Deep learning theory: I develop approximation theory and statistical learning theory for deep neural networks on various problems, especially when the data have low-dimensional structures.
PDE learning: I design efficient and robust algorithms for PDE learning from noisy data sets.
Numerical PDEs: I use the level set method and operator-splitting methods to solve nonlinear PDEs and related problems. My recent work proposes operator-splitting-based numerical solvers for Monge-Ampère-type equations.
Image processing: I design image regularization models and efficient algorithms based on operator-splitting methods.
Talks
Exploiting Data Low-Dimensional Structures by Deep Neural Networks with Applications on Learning Operators.
Workshop on Inverse Problems and Image Processing, Hohhot, Aug. 2024.
Exploiting Data Low-Dimensional Structures by Deep Neural Networks with Applications on Learning Operators.
Conference of Scientific Machine Learning, Shanghai Jiao Tong University, Shanghai, Aug. 2024.
Exploiting Data Low-Dimensional Structures by Deep Neural Networks with Applications on Learning Operators.
International Conference on Scientific Computation and Differential Equations, National University of Singapore, Singapore, Jul. 2024.
A Mathematical Explanation of Encoder-Decoder Based Neural Networks.
SIAM Conference on Imaging Science, Atlanta, May 2024.
Theories for Learning Functions and Operators with Low-Dimensional Structures by Deep Neural Networks.
The Hong Kong-Taiwan Joint Conference on Applied Mathematics and Related Topics (HTJC2024), Chung Cheng University, Chiayi, May 2024.
Deep Learning Theories for Problems with Low-Dimensional Structures.
SIAM Conference on Applied Linear Algebra (LA24), Sorbonne Université, Paris, May 2024.
Elastica Models for Color Image Regularization.
Workshop on Inverse Problems and Image Processing, Beijing Normal University, Beijing, Dec. 2023.
Deep Learning Theories for Problems with Low-Dimensional Structures.
Seminar, Beijing Normal University, Zhuhai, Nov. 2023.
Deep Learning Theories for Problems with Low-Dimensional Structures.
ICIAM, Tokyo, Aug. 2023.
Deep Learning Theories for Problems with Low-Dimensional Structures.
Seminar, Tianjin Normal University, Tianjin, Jun. 2023.
Elastica Models for Color Image Regularization.
Seminar, Nankai University, Tianjin, Jun. 2023.
Robust PDE Identification from a Noisy Data Set.
Workshop on Inverse Problems and Imaging, Harbin Institute of Technology Shenzhen, Shenzhen, Apr. 2023.
Robust PDE Identification from a Noisy Data Set.
The 18th IEEE-EMBS International Summer School and Symposium on Medical Devices and Biosensors / The 14th International School and Symposium on Biomedical and Health Engineering, Hong Kong Centre for Cerebro-Cardiovascular Health Engineering (COCHE), online, Oct. 2022.
Deep Learning Theories for Problems with Low-Dimensional Structures.
SIAM Conference on Mathematics of Data Science, online, Sep. 2022.
Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint.
ICML, online, Jul. 2022.
Robust PDE Identification from Noisy Data Sets.
SIAM Annual Meeting, online, Jul. 2022.
Deep Learning Theories for Problems with Low-Dimensional Structures.
Seminar, Tianjin University, online, Jul. 2022.
Elastica Models for Color Image Regularization.
Workshop on Recent Advances in Image Processing, CUHK (Shenzhen), online, Apr. 2022.
Off-policy learning and classification on low-dimensional manifolds by deep neural networks.
Mathematical Data Science Seminar, Purdue University, online, Feb. 2022.
Besov Function Approximation and Binary Classification on Low-Dimensional Manifolds Using Convolutional Residual Networks.
ICML, online, Jul. 2021.
Learning functions varying along an active subspace.
SIAM Student Seminar, Georgia Institute of Technology, Feb. 2020.
Approximate functions varying along an active subspace.
Workshop on New Trends in Machine Learning and Numerical PDEs, Hong Kong Baptist University, Dec. 2019.
A level set based variational principal flow method for nonparametric dimension reduction on Riemannian manifolds.
Scientific Computing Seminars, University of Houston, Nov. 2018.
Poster Presentations
Learning functions varying along an active subspace.
2020 Georgia Scientific Computing Symposium, Emory University, Feb. 2020.
Approximate functions varying along an active subspace.
Workshop on Recent Developments on Mathematical/Statistical approaches in DAta Science (MSDAS), The University of Texas at Dallas, May 2019.
A level set based variational principal flow method for nonparametric dimension reduction on Riemannian manifolds.
2019 Georgia Scientific Computing Symposium, Georgia Institute of Technology, Feb. 2019.
A level set based variational principal flow method for nonparametric dimension reduction on Riemannian manifolds.
Meeting the Statistical Challenges in High Dimensional Data and Complex Networks, National University of Singapore, Feb. 2018.