TOP500

ISC HPC Blog

Reaching exascale is rightly posed as a combination of challenges related to (i) energy efficiency, (ii) the heterogeneity and resiliency of computation, storage, and communication, and (iii) the sheer scale of the parallelism involved.

For many decades, if you belonged to the "chosen people," supercomputers were on the floor of your HPC center – mighty, powerful, thundering, admired, the Holy Grail of science and engineering. And you were one of the select few, the HPC-savvy; you were the magician conjuring grand challenge solutions out of this magic box on the floor.

As a theoretical physicist, I am fascinated by the "fundamental" laws of physics describing basic phenomena of nature. Many of these laws are tested through numerous experiments and are "valid" within the constraints defined by those experiments. In this sense we "believe," for example, in Newton's law of gravity.

If certain proposed legislation in the US becomes the law of the land, supercomputers funded by the National Science Foundation (NSF) might have a rather different set of workloads in the not-too-distant future. In fact, there might be less need for these supercomputers altogether.