December 9, 2014 - In a recent review article published in the International Journal of Modern Physics B, Prof. Nikolic and collaborators - Prof. Freericks from the Physics Department and Prof. Frieder from the Computer Science Department at Georgetown University - have proposed new routes to tackle the challenge posed by quantum many-body systems far from equilibrium, which are encountered in condensed matter physics, AMO physics, plasma physics and cosmology. The review serves as a sort of manifesto that could mobilize the physics, computer science and applied mathematics communities to work together.

Conventional forms of big data science and data mining involve the creation of large but finite data sets and the analysis of that data to produce insight and to predict the future behavior of a system. Many physics problems are analyzed in this big-data paradigm, such as high-energy-physics scattering experiments, which routinely create petabytes of data. Other fields of science also involve big data science, with large datasets being analyzed in areas as diverse as weather forecasting, genomics, and neuroscience. Many algorithms and data-analysis tools are used to interact with such datasets; often they are created specifically for a given data set, even if they employ more general-purpose algorithms and strategies for the data analysis.

However, some problems are too large to be treated within the big-data-science paradigm. The nonequilibrium quantum many-body problem on a lattice is just such a problem: its Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer, so it can effectively be thought of as an infinite-sized data set. Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve them more accurately and for longer times.
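To get a feel for the exponential growth mentioned above, consider a lattice of spin-1/2 sites, where each site contributes two local states. The short sketch below (an illustration of the scaling argument, not code from the review) computes the Hilbert-space dimension and the memory needed just to store a single state vector:

```python
def hilbert_dim(n_sites, local_dim=2):
    # Many-body Hilbert-space dimension for n_sites,
    # each with local_dim states (2 for spin-1/2)
    return local_dim ** n_sites

def state_vector_gib(n_sites, local_dim=2):
    # Memory for one state vector of complex doubles
    # (16 bytes per amplitude), in GiB
    return 16 * hilbert_dim(n_sites, local_dim) / 2**30

for n in (20, 40, 60):
    print(f"{n} sites: dim = 2^{n} = {hilbert_dim(n)}, "
          f"state vector = {state_vector_gib(n):.3g} GiB")
```

Already at 60 spin-1/2 sites a single state vector requires on the order of 10^10 GiB, far beyond any machine, which is why such problems cannot simply be stored and mined as a finite data set.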