Research

Structured Representation Learning Lab (SRL Lab)

PI: Yue SONG

Research Direction: Structured Representation Learning for Scientific Intelligence: Bidirectional Synergy between AI and Science

Research Group Introduction

Our lab is dedicated to advancing structured representation learning at the frontier of science and artificial intelligence. Our core methodology extracts universal informational priors and constraints from structured principles inherent in the natural sciences, such as physics and neuroscience (e.g., matrix geometry, dynamical systems, and oscillatory synchronization), and transforms them into computable, embeddable inductive biases.

Building on these principles, we develop next-generation AI models that exhibit physical consistency, semantic disentanglement, and transformation equivariance (Science4AI), with the aim of significantly enhancing interpretability, generalization, and data efficiency. Ultimately, we apply these models systematically to critical challenges in complex scientific discovery (AI4Science), including modeling and predicting multi-scale dynamical systems, solving high-dimensional inverse problems, and performing generation and reasoning on non-Euclidean structured data (such as fluid fields, biological motion trajectories, and material microstructures), thereby contributing to the establishment of a new paradigm in scientific intelligence.

Representative Publications

• Yue Song, T. Anderson Keller, Nicu Sebe, and Max Welling. Structured Representation Learning: From Homomorphisms and Disentanglement to Equivariance and Topography. Springer Nature. 2025.

• Yue Song, T. Anderson Keller, Nicu Sebe, and Max Welling. “Flow Factorized Representation Learning”. NeurIPS. 2023.

• Yue Song, Nicu Sebe, and Wei Wang. “Fast Differentiable Matrix Square Root”. ICLR. 2022.

• Yue Song, T. Anderson Keller, Sevan Brodjian, Takeru Miyato, Yisong Yue, Pietro Perona, and Max Welling. “Kuramoto Orientation Diffusion Models”. NeurIPS. 2025.

• Yue Song, T. Anderson Keller, Nicu Sebe, and Max Welling. “Latent Traversals in Generative Models as Potential Flows”. ICML. 2023.

• Yue Song, T. Anderson Keller, Yisong Yue, Pietro Perona, and Max Welling. “Langevin Flows for Modeling Neural Latent Dynamics”. Cognitive Computational Neuroscience (CCN). 2025.

• Yue Song, Nicu Sebe, and Wei Wang. “RankFeat: Rank-1 Feature Removal for Out-of-distribution Detection”. NeurIPS. 2022.

• Ziheng Chen, Yue Song*, Yunmei Liu, and Nicu Sebe. “A Lie Group Approach to Riemannian Batch Normalization”. ICLR. 2024. (* denotes corresponding author)

• Guanghao Wei, Yining Huang, Chenru Duan, Yue Song†, and Yuanqi Du†. “Navigating Chemical Space with Latent Flows”. NeurIPS. 2024. († denotes co-supervision)

• Raphi Kang, Yue Song, Georgia Gkioxari, and Pietro Perona. “Is CLIP ideal? No. Can we fix it? Yes!” ICCV. 2025.

Group Members

News & Updates
