This is a past event.
Abstract: In this talk, we present our work on an adaptive multiresolution sparse grid discontinuous Galerkin (DG) method and its open-source C++ package. The method is constructed from two types of multiwavelets and has been demonstrated to be effective in adaptive calculations, particularly for high-dimensional applications. Numerical results for Hamilton-Jacobi equations, nonlinear Schrödinger equations, and wave equations will be discussed. We will also illustrate the main structure and features of the open-source C++ package.
Bio: Juntao Huang is an assistant professor at Texas Tech University. He obtained his Ph.D. in Applied Mathematics in 2018 and his bachelor's degree in 2013, both from Tsinghua University. Prior to joining Texas Tech University in 2022, he worked as a visiting assistant professor at Michigan State University. His current research interests focus on the design and analysis of numerical methods for PDEs and, more recently, on using machine learning to assist traditional scientific computing tasks. Topics of special interest include adaptive sparse grid DG methods, structure-preserving machine learning moment closures for kinetic models, structure-preserving time discretizations for hyperbolic equations, and boundary schemes for the lattice Boltzmann method.