Abstract

Obtaining an accurate ground state wave function is one of the great challenges in the quantum many-body problem. In this Letter, we propose a new class of wave functions, neural network backflow (NNB). The backflow approach, pioneered by Feynman and Cohen [Phys. Rev. 102, 1189 (1956), doi:10.1103/PhysRev.102.1189], adds correlation to a mean-field ground state by transforming the single-particle orbitals in a configuration-dependent way. NNB uses a feed-forward neural network to learn the optimal transformation via variational Monte Carlo calculations. NNB directly dresses a mean-field state, can be systematically improved, and directly alters the sign structure of the wave function. It generalizes the standard backflow [L. F. Tocchio et al., Phys. Rev. B 78, 041101(R) (2008), doi:10.1103/PhysRevB.78.041101], which we show how to explicitly represent as an NNB. We benchmark the NNB on Hubbard models at intermediate doping, finding that it significantly decreases the relative error, restores the symmetry of both observables and single-particle orbitals, and decreases the double-occupancy density. Finally, we illustrate interesting patterns in the weights and biases of the optimized neural network.
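To make the construction concrete, the following is a minimal illustrative sketch (not the authors' code) of a neural-network backflow amplitude for spinless fermions on a lattice: a small feed-forward network maps an occupation configuration to configuration-dependent corrections of the mean-field single-particle orbitals, and the amplitude is the Slater determinant of the corrected orbitals. The system sizes, network shape, and random initialization here are assumptions for illustration; in practice the parameters would be optimized by variational Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)
L, N, H = 8, 4, 16          # sites, particles, hidden units (illustrative)

phi = rng.standard_normal((L, N))          # mean-field orbitals phi_k(r)
W1 = rng.standard_normal((H, L)) * 0.1     # network weights and biases
b1 = np.zeros(H)                           # (variational parameters)
W2 = rng.standard_normal((L * N, H)) * 0.1
b2 = np.zeros(L * N)

def nnb_amplitude(config):
    """<x|Psi_NNB>: Slater determinant of configuration-dependent orbitals."""
    x = np.asarray(config, dtype=float)        # occupations n_r in {0, 1}
    h = np.tanh(W1 @ x + b1)                   # hidden layer
    dphi = (W2 @ h + b2).reshape(L, N)         # orbital corrections
    orbitals = phi + dphi                      # backflow-dressed orbitals
    occ = np.flatnonzero(config)               # occupied sites
    return np.linalg.det(orbitals[occ, :])     # amplitude for this configuration

config = np.zeros(L, dtype=int)
config[:N] = 1                                 # one sample configuration
amp = nnb_amplitude(config)
```

Because the orbital corrections `dphi` depend on the full configuration, the determinant is no longer that of a fixed orbital matrix, which is what lets the ansatz alter the sign structure of the mean-field state.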
