This is a past event.
Title: Transformer × Finite Element Methods
Abstract: The Transformer, introduced in "Attention Is All You Need", is now the ubiquitous architecture behind state-of-the-art models in Natural Language Processing (NLP), Computer Vision (CV), and scientific breakthroughs such as AlphaFold 2. At its heart is the "attention mechanism". In this talk, we shall study the mathematical structure of attention through the lens of operator approximation theory in Hilbert spaces. Inspired by finite element methods, one of the most widely used tools in science and engineering, the attention mechanism can be modified and combined with other operator learners to achieve a generational leap in performance on various PDE-related operator learning tasks. In the latter part of the talk, we will present new results on boundary value inverse problems, which further strengthen the case that a priori mathematical knowledge can facilitate a more structure-conforming design of neural networks.
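For readers unfamiliar with the talk's central object, standard scaled dot-product attention can be sketched in a few lines. This is a generic NumPy illustration of the vanilla mechanism with made-up shapes, not the speaker's modified, finite-element-inspired architecture:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention from "Attention Is All You Need".

    Q, K, V: (n, d) arrays whose rows are token (or mesh-node) features.
    Returns an (n, d) array: each output row is a convex combination of
    the rows of V, weighted by a row-wise softmax of Q K^T / sqrt(d).
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # (n, n) similarity kernel
    scores -= scores.max(axis=-1, keepdims=True)   # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1 (softmax)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Viewing the (n, n) weight matrix as a discretized integral kernel acting on the rows of V is one way the operator-theoretic perspective of the talk enters.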
Bio: Shuhao Cao is an assistant professor of Mathematics in the Division of Computing, Analytics, and Mathematics, School of Science and Engineering at UMKC. Shuhao received his Ph.D. in Mathematics working on finite element methods. His areas of expertise are scientific computing, computational methods for partial differential equations, scientific machine learning, and multilevel methods. Recently, Shuhao has been dedicated to combining numerical methods in a Hilbertian setup with the Transformer neural network architecture. Shuhao also codes extensively and contributes to open-source scientific computing and machine learning packages (https://github.com/scaomath).