Research
Research Interests
Mathematical foundations of deep learning
Applied probability and stochastic dynamics
Applied and numerical analysis for PDEs
Bayesian and computational statistics
Inverse problems and uncertainty quantification
More information on my research can be found in my Google Scholar profile.
My research is currently funded by the grants NSF DMS-2343135 and NSF DMS-2436333.
Preprints
Wall laws for viscous flows in 3D randomly rough pipes: optimal convergence rates and stochastic integrability
In collaboration with M. Higaki and J. Zhuge
Submitted [arXiv]
Provable In-Context Learning of Linear Systems and Linear Elliptic PDEs with Transformers
In collaboration with F. Cole, R. O'Neil and T. Zhang
Accelerating Langevin sampling with birth-death
In collaboration with J. Lu and J. Nolen
Published Journal Papers
Generative downscaling of PDE solvers with physics-guided diffusion models
In collaboration with W. Xu
Journal of Scientific Computing 101 (2024) [arXiv]
Fully discretized Sobolev gradient flow for the Gross-Pitaevskii eigenvalue problem
In collaboration with Z. Chen, J. Lu, X. Zhang
Accepted by Mathematics of Computation [arXiv]
Optimal Deep Neural Network Approximation for Korobov Functions with respect to Sobolev Norms
In collaboration with Y. Yang
Neural Networks 180 (2024) [arXiv]
On the convergence of Sobolev gradient flow for the Gross-Pitaevskii eigenvalue problem
In collaboration with Z. Chen, J. Lu, X. Zhang
SIAM J. Numer. Anal. 62 (2023) [arXiv]
Birth-death dynamics for sampling: Global convergence, approximations and their asymptotics
In collaboration with D. Slepčev and L. Wang
Nonlinearity (2023) [arXiv]
A Regularity Theory for Static Schrödinger Equations on $\mathbb{R}^d$ in Spectral Barron Spaces
In collaboration with Z. Chen, J. Lu and S. Zhou
SIAM J. Math. Anal. 55 (2023) [arXiv]
Exponential-wrapped distributions on symmetric spaces
In collaboration with D. Li, E. Chevalier and D. Dunson
SIAM Journal on Mathematics of Data Science (SIMODS) 4 (2022) [arXiv]
Solving multiscale steady radiative transfer equation using neural networks with uniform stability
In collaboration with L. Wang and W. Xu
Research in the Mathematical Sciences [arXiv] [Code]
A Priori Generalization Error Analysis of Two-Layer Neural Networks for Solving High Dimensional Schrödinger Eigenvalue Problems
In collaboration with J. Lu
Communications of the American Mathematical Society (CAMS) [arXiv]
Continuum limit and preconditioned Langevin sampling of the path integral molecular dynamics
In collaboration with J. Lu and Z. Zhou
Journal of Computational Physics 415 (2020) [Journal] [arXiv]
Quantitative propagation of chaos in the bimolecular reaction-diffusion model
In collaboration with T-S. Lim and J. Nolen
SIAM J. Math. Anal. 52 (2020) [Journal] [arXiv]
Geometric ergodicity of Langevin dynamics with Coulomb interactions
In collaboration with J. C. Mattingly
Nonlinearity 33 (2019) [Journal] [arXiv]
Uniform-in-time weak error analysis for stochastic gradient descent algorithms via diffusion approximation
In collaboration with Y. Feng, T. Gao, L. Li and J-G. Liu
Commun. Math. Sci. [Journal] [arXiv]
Exponential decay of Rényi divergence under Fokker-Planck equations
In collaboration with Y. Cao and J. Lu
J. Stat. Phys. 176 (2019) [Journal] [arXiv]
An operator splitting scheme for the fractional kinetic Fokker-Planck equation
In collaboration with M. H. Duong
Discrete Contin. Dyn. Syst. Ser. A 39 (2019) [Journal] [arXiv]
On the rate of convergence of empirical measures in $\infty$-Wasserstein distance for unbounded density function
In collaboration with A. Liu and J-G. Liu
Quarterly of Applied Mathematics 77 (2019) [Journal] [arXiv]
Gradient flow structure and exponential decay of the sandwiched Rényi divergence for primitive Lindblad equations with GNS-detailed balance
In collaboration with Y. Cao and J. Lu
J. Math. Phys. 60 (2019), 052202 [Journal] [arXiv]
Scaling limit of the Stein variational gradient descent: the mean field regime
In collaboration with J. Lu and J. Nolen
SIAM J. Math. Anal. 51 (2019), 648-671 [Journal] [arXiv]
Gaussian approximations for probability measures on $\mathbb{R}^d$
In collaboration with A.M. Stuart and H. Weber
SIAM/ASA J. Uncertainty Quantification 5 (2017), 1136-1165 [Journal] [arXiv]
Gaussian approximation for transition paths in Brownian dynamics
In collaboration with A.M. Stuart and H. Weber
SIAM J. Math. Anal. 49 (2017), 3005-3047 [Journal] [arXiv]
A Bayesian level set method for geometric inverse problems
In collaboration with M.A. Iglesias and A.M. Stuart
Interfaces and Free Boundaries 18 (2016), 181-217 [Journal] [arXiv]
The factorization method for inverse elastic scattering from periodic structures
In collaboration with G. Hu and B. Zhang
Inverse Problems 29 (2013), 115005 [Journal]
Peer-Reviewed Conference Proceedings
Score-based generative models break the curse of dimensionality in learning a family of sub-Gaussian distributions
In collaboration with F. Cole
Two-scale gradient descent ascent dynamics finds mixed Nash equilibria of continuous games: A mean-field perspective
Transfer learning enhanced DeepONet for long-time prediction of evolution equations
In collaboration with L. Wang and W. Xu
Accepted by the 37th AAAI Conference on Artificial Intelligence (2023) [arXiv] [Code]
On the Representation of Solutions to Elliptic PDEs in Barron Spaces
In collaboration with Z. Chen and J. Lu
The 35th Conference on Neural Information Processing Systems (NeurIPS) (2021) (Spotlight presentation) [arXiv]
A Priori Generalization Analysis of the Deep Ritz Method for Solving High Dimensional Elliptic Equations
In collaboration with J. Lu and M. Wang
The 34th Annual Conference on Learning Theory (COLT) (2021) [arXiv]
A Universal Approximation Theorem of Deep Neural Networks for Expressing Distributions
In collaboration with J. Lu
The 34th Conference on Neural Information Processing Systems (NeurIPS) (2020) [arXiv]
A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth
In collaboration with Y. Lu, C. Ma, J. Lu and L. Ying
The 37th International Conference on Machine Learning (ICML) (2020) [arXiv]
Unpublished Note
On the Bernstein-Von Mises Theorem for High Dimensional Nonlinear Bayesian Inverse Problems
Other Publications
Asymptotic analysis and computations of probability measures, PhD thesis, University of Warwick, 2017
[Link]