Predicting Physics in Mesh-Reduced Space with Temporal Attention

December 13, 2022
9:00 am ET
Cummings 280
Speaker: Xu Han
Host: Liping Liu

Abstract

Quals talk:

Graph-based next-step prediction models have recently been very successful in modeling complex, high-dimensional physical systems on irregular meshes. However, due to their short temporal attention span, these models suffer from error accumulation and drift. In this presentation, I will talk about a new method that captures long-term dependencies through a transformer-style temporal attention model. I will introduce an encoder-decoder structure that summarizes features and creates a compact mesh representation of the system state, allowing the temporal model to operate on low-dimensional mesh representations in a memory-efficient manner. Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks, from sonic shocks to vascular flow. We demonstrate stable rollouts without the need for training noise and show phase-stable predictions even for very long sequences. More broadly, I will show that our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
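To give a rough sense of the architecture described above, the sketch below outlines the encode / temporal-attention / decode pipeline in PyTorch. This is not the speaker's implementation: the class names (MeshEncoder, TemporalAttention, MeshDecoder), dimensions, and the use of simple attention pooling in place of message-passing GNN layers are all illustrative assumptions, shown only to clarify how a temporal model can operate on a compact latent mesh representation.

```python
import torch
import torch.nn as nn

class MeshEncoder(nn.Module):
    """Compress per-node mesh features into a small set of latent tokens.
    (Hypothetical sketch: a real encoder would use message-passing GNN layers.)"""
    def __init__(self, node_dim, latent_dim, num_tokens):
        super().__init__()
        self.node_mlp = nn.Sequential(nn.Linear(node_dim, latent_dim), nn.ReLU(),
                                      nn.Linear(latent_dim, latent_dim))
        # Learned queries pool node features into a fixed-size representation.
        self.queries = nn.Parameter(torch.randn(num_tokens, latent_dim))
        self.pool = nn.MultiheadAttention(latent_dim, num_heads=4, batch_first=True)

    def forward(self, node_feats):            # (batch, num_nodes, node_dim)
        h = self.node_mlp(node_feats)
        q = self.queries.expand(h.size(0), -1, -1)
        z, _ = self.pool(q, h, h)             # (batch, num_tokens, latent_dim)
        return z

class MeshDecoder(nn.Module):
    """Map latent tokens back to per-node predictions by attending from nodes."""
    def __init__(self, node_dim, latent_dim):
        super().__init__()
        self.node_mlp = nn.Linear(node_dim, latent_dim)
        self.attend = nn.MultiheadAttention(latent_dim, num_heads=4, batch_first=True)
        self.out = nn.Linear(latent_dim, node_dim)

    def forward(self, node_feats, z):
        q = self.node_mlp(node_feats)
        h, _ = self.attend(q, z, z)
        return self.out(h)                    # (batch, num_nodes, node_dim)

class TemporalAttention(nn.Module):
    """Causal transformer over the sequence of latent mesh representations."""
    def __init__(self, latent_dim, num_tokens, num_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(latent_dim, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers)
        self.num_tokens = num_tokens

    def forward(self, z_seq):                 # (batch, time, num_tokens, latent_dim)
        b, t, k, d = z_seq.shape
        x = z_seq.reshape(b, t * k, d)
        # Block-causal mask: tokens at step t attend only to steps <= t.
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool), diagonal=1)
        mask = mask.repeat_interleave(k, 0).repeat_interleave(k, 1)
        y = self.transformer(x, mask=mask)
        return y.reshape(b, t, k, d)[:, -1]   # latent prediction for the next step

# Illustrative usage with made-up sizes: 1 trajectory, 2000 mesh nodes, 8-step history.
enc, dec, tmp = MeshEncoder(3, 64, 16), MeshDecoder(3, 64), TemporalAttention(64, 16)
states = torch.randn(1, 8, 2000, 3)                       # (batch, time, nodes, features)
z_seq = torch.stack([enc(states[:, t]) for t in range(8)], dim=1)
z_next = tmp(z_seq)                                        # predicted latent for the next step
next_state = dec(states[:, -1], z_next)                    # decode back onto the mesh
```

The point of the sketch is the memory argument: the temporal transformer attends over a handful of latent tokens per step rather than thousands of mesh nodes, which is what makes long attention spans over rollouts feasible.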

Please join the meeting in Cummings 280.

Zoom is not available for this event; please disregard the dial-in passcode included in the email.