Speaker: Yin Li (Peng Cheng Laboratory)
Title: Forward Model the Universe in the Era of Deep Learning
Date (JST): Tue, Sep 19, 2023, 11:00 - 12:00
Rapid advances in deep learning have brought not only myriad powerful neural network models, but also breakthroughs that benefit established scientific research. In particular, automatic differentiation (AD) tools and computational accelerators like GPUs have facilitated forward modeling of the Universe with differentiable simulations. Because they rely on backpropagation, current differentiable cosmological simulations are limited by memory, and are thus subject to a trade-off between time and space/mass resolution, usually sacrificing both. We present a new approach free of such constraints, using the adjoint method and reverse time integration. Being both memory- and computation-efficient, it enables larger and more accurate forward modeling at the field level, and will improve inference of fundamental physics from cosmological data in both precision and efficiency. Its differentiability opens up the possibility of data-driven workflows that incorporate trainable neural networks in differentiable physical models. We implement it in pmwd (particle-mesh with derivatives), an open-source particle-mesh (PM) N-body library. Built on the powerful AD system JAX, pmwd is fully differentiable and highly performant on GPUs. Improvements are ongoing and planned to make it a unified framework for modeling and inferring from cosmological data at the field level.
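To illustrate the idea of differentiating through a forward model, the following is a minimal JAX sketch, not the pmwd API: a toy leapfrog integrator with a hypothetical harmonic force standing in for PM gravity, differentiated end to end with `jax.grad` (i.e. backpropagation through all time steps, the memory-bound approach the adjoint method avoids).

```python
import jax
import jax.numpy as jnp

# Hypothetical harmonic force, a stand-in for particle-mesh gravity.
def force(x, omega2):
    return -omega2 * x

# Toy leapfrog (kick-drift-kick) integrator, returning a scalar
# "field-level" summary of the final particle positions.
def simulate(omega2, x0, v0, dt=0.1, steps=10):
    def step(carry, _):
        x, v = carry
        v = v + 0.5 * dt * force(x, omega2)   # half kick
        x = x + dt * v                        # drift
        v = v + 0.5 * dt * force(x, omega2)   # half kick
        return (x, v), None
    (x, v), _ = jax.lax.scan(step, (x0, v0), None, length=steps)
    return jnp.sum(x ** 2)

x0 = jnp.ones(4)
v0 = jnp.zeros(4)

# Gradient of the summary with respect to the model parameter omega2,
# obtained by reverse-mode AD through every integration step.
dloss_domega2 = jax.grad(simulate)(0.5, x0, v0)
print(float(dloss_domega2))
```

Plain backpropagation as above must store (or recompute) the state at every step; the adjoint method presented in the talk instead re-integrates the trajectory backward in time, making the memory cost independent of the number of steps.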
Remarks: Lecture Hall + Zoom