Renormalization-Inspired Effective Field Neural Networks for Scalable Modeling of Classical and Quantum Many-Body Systems

Xi Liu, Yujun Zhao, Chun Yu Wan, Yang Zhang, Junwei Liu

We introduce Effective Field Neural Networks (EFNNs), a new architecture based on continued functions, the mathematical tools used in renormalization to resum divergent perturbative series. Our key insight is that neural networks can implement these continued functions directly, providing a principled way to model many-body interactions. Testing on three systems (a classical 3-spin infinite-range model, a continuous classical Heisenberg spin system, and a quantum double-exchange model), we find that EFNNs outperform standard deep networks, ResNet, and DenseNet. Most striking is the EFNNs' generalization: trained on $10 \times 10$ lattices, they accurately predict behavior on systems up to $40 \times 40$ with no additional training, and the accuracy improves with system size, with a computational speed-up of $10^{3}$ over exact diagonalization (ED) for the $40 \times 40$ lattice. This demonstrates that EFNNs capture the underlying physics rather than merely fitting data, making them valuable beyond many-body problems in any field where renormalization ideas apply.
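To make the central idea concrete, the sketch below evaluates a finite continued fraction by iterating from the innermost term outward, mirroring the layer-by-layer nesting that, per the abstract, EFNN layers implement. This is a minimal illustration only: the function name and the scalar, fixed-coefficient form are assumptions, not the paper's actual network architecture.

```python
def continued_function(a, b, depth):
    """Evaluate the finite continued fraction
        a[0] + b[1] / (a[1] + b[2] / (a[2] + ... + b[depth] / a[depth]))
    by unrolling from the innermost level outward. Each pass of the loop
    plays the role of one nested level; in an EFNN such a level would be a
    learned layer rather than fixed scalars (hypothetical analogy)."""
    x = a[depth]                      # innermost term
    for k in range(depth, 0, -1):
        x = a[k - 1] + b[k] / x       # wrap one more level around the result
    return x

# With all coefficients equal to 1, the continued fraction converges to the
# golden ratio (1 + sqrt(5)) / 2, a standard sanity check.
if __name__ == "__main__":
    a = [1.0] * 31
    b = [1.0] * 31
    print(continued_function(a, b, 30))
```

The point of the analogy is that a deep stack of such nestings can represent quantities whose plain power-series expansion diverges, which is why continued functions appear in renormalization.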
