12.–14. Nov. 2025
SoN
Europe/Berlin timezone

Self-learning Monte Carlo method with equivariant transformer

13.11.2025, 15:00
30m
SoN

Plenary Talk

Speaker

Yuki Nagai (The University of Tokyo)

Description

Machine learning and deep learning have revolutionized computational physics, particularly in the simulation of complex systems. Equivariance plays a crucial role in modeling physical systems, as it enforces symmetry constraints that act as strong inductive biases on the learned probability distributions. However, building such symmetries into a model can restrict its expressive power, which in turn can lead to low acceptance rates in self-learning Monte Carlo (SLMC) methods.
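
To make the acceptance-rate issue concrete, the sketch below shows the standard SLMC correction step, in which a proposal is generated by simulating a learned effective model and then accepted or rejected against the exact Hamiltonian. This is a minimal textbook-style illustration, not the speaker's code; `energy_exact`, `energy_eff`, and `beta` are assumed placeholders.

```python
import numpy as np

def slmc_accept(x, x_prop, energy_exact, energy_eff, beta, rng=None):
    """Standard SLMC correction step (generic form, not the speaker's code).

    x_prop is assumed to be drawn by running a long Markov chain of the
    effective model started from x, so its proposal density is the effective
    Boltzmann weight. Detailed balance for the exact model then gives
        A = min(1, exp(-beta * (dE_exact - dE_eff))).
    """
    rng = np.random.default_rng() if rng is None else rng
    d_exact = energy_exact(x_prop) - energy_exact(x)  # exact energy difference
    d_eff = energy_eff(x_prop) - energy_eff(x)        # effective-model difference
    return rng.random() < np.exp(-beta * (d_exact - d_eff))
```

The acceptance probability approaches one when the effective model's energy differences closely track the exact ones; an overly restricted effective model (such as a linear one) produces large mismatches and hence the low acceptance rates mentioned above.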

In this work, we introduce a symmetry-equivariant attention mechanism for SLMC that can be systematically improved. We evaluate our architecture on the spin–fermion (double-exchange) model on a two-dimensional lattice. Our results demonstrate that the proposed method overcomes the poor acceptance rates observed in linear models and exhibits a scaling law analogous to that of large language models, with model quality improving monotonically with the number of layers [1]. This work paves the way toward more accurate and efficient Monte Carlo algorithms powered by machine learning for simulating complex physical systems.

[1] Y. Nagai and A. Tomiya, J. Phys. Soc. Jpn. 93, 114007 (2024).
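
As a rough illustration of what symmetry-equivariant attention can mean on a lattice, the toy layer below computes attention logits that depend only on the relative displacement between sites of a periodic L × L lattice, which makes the layer equivariant under lattice translations. This is a simplified stand-in, not the architecture of [1]; `logit_table` is a hypothetical learned table of displacement logits.

```python
import numpy as np

def translation_equivariant_attention(field, logit_table):
    """Toy translation-equivariant attention on an L x L periodic lattice.

    field: (L, L, d) per-site features; logit_table: (L, L) logits indexed by
    the relative displacement (dx, dy) mod L. Because the attention weights
    ignore absolute position, shifting the input shifts the output identically.
    """
    L = field.shape[0]
    out = np.zeros_like(field)
    for x in range(L):
        for y in range(L):
            # Logits against every site depend only on the displacement mod L.
            dx = (np.arange(L)[:, None] - x) % L
            dy = (np.arange(L)[None, :] - y) % L
            logits = logit_table[dx, dy]
            w = np.exp(logits - logits.max())  # softmax over all sites
            w /= w.sum()
            out[x, y] = np.tensordot(w, field, axes=([0, 1], [0, 1]))
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L, d = 6, 3
    field = rng.normal(size=(L, L, d))
    table = rng.normal(size=(L, L))
    out = translation_equivariant_attention(field, table)
    shifted = translation_equivariant_attention(np.roll(field, 2, axis=0), table)
    assert np.allclose(shifted, np.roll(out, 2, axis=0))  # equivariance check
```

Stacking layers of this kind is where the depth dependence enters: in the reported scaling behavior, model quality improves systematically as such attention layers are added.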
