The Exascale challenge

Monday 11th September 2017

Who's the slickest at exascaling?

Challenges arising from climate change, manufacturing, genomics, cosmology, artificial intelligence, big data and the internet of things are expected to be tackled by the next generation of exascale supercomputers. Exascale is an artificial milestone figure represented by the number one followed by eighteen zeros: exascale computers will be able to perform a quintillion calculations per second, while consuming only around 20 MW of power. Renewable energy will play a paramount role in improving the energy efficiency of supercomputers in the future. This is one of the challenges scientists face.
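Taken at face value, those two targets - a quintillion (10^18) calculations per second within a 20 MW power budget - imply an efficiency of about 50 billion calculations per watt. A back-of-the-envelope check (the figures are the article's round targets, not measurements of any real machine):

```python
# Efficiency implied by the exascale targets quoted in the article:
# one exaflop per second within a 20 MW power budget.
# These are round target figures, not measurements of a real machine.

exaflops_per_second = 10**18      # one quintillion calculations per second
power_budget_watts = 20 * 10**6   # 20 MW

flops_per_watt = exaflops_per_second / power_budget_watts
print(f"{flops_per_watt / 10**9:.0f} GFLOPS per watt")  # 50 GFLOPS per watt
```

Dividing the two numbers is trivial; achieving both at once is the engineering problem the researchers describe.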

“The reason why it’s so difficult to build these computers is the amount of power they need,” says Catalin Ciobanu from the University of Amsterdam, in the Netherlands, who leads the EXTRA (Exploiting Exascale Technology with Reconfigurable Architectures) project. EXTRA aims to lay the groundwork for exascale supercomputers, creating a framework that can be used by the HPC hardware nodes of the future.

According to Ciobanu, supercomputers can need as much power as a whole country, or a very big city. The challenge is to find solutions efficient and economical enough that high-performance computers do not consume such large amounts of power. The project is part of the EU Future and Emerging Technologies - High-Performance Computing programme (H2020 FET-HPC), which is developing the next generation of HPC technologies and helping existing software make better use of supercomputers.

“We also have to think about industrial needs, not only scientific research. For major industries such as automotive, avionics and pharmaceuticals it’s extremely important to compete in high-performance computing,” says Mario Kovač, from the University of Zagreb, who works in the FET project MANGO (Manycore Architectures for Next Generation HPC systems), which tests new, especially heterogeneous, architectures for exascale supercomputers. “We are also focusing on an efficient architecture for liquid cooling,” adds Kovač.

“What you will see is an expansion in the utility of supercomputers. New applications are emerging all the time,” says George Beckett who, together with Nick Brown - both from the University of Edinburgh, in Scotland - works in the INTERTWinE project (Interoperability Toward Exascale). They look at how the large repository of existing scientific software will be able to work on computers larger than the systems we have today, and investigate the problem of interoperability of programming models at exascale.
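INTERTWinE's concern - making different parallel programming models cooperate in one program - can be loosely illustrated in Python by layering two models, process-level and thread-level parallelism. This is a toy analogue of the MPI-plus-OpenMP hybrids common in HPC, not project code; all names here are illustrative:

```python
# Toy analogue (not INTERTWinE code) of hybrid parallel programming:
# processes stand in for distributed MPI ranks, threads for on-node
# OpenMP workers. The interoperability question is making such layered
# models cooperate without oversubscribing or deadlocking the machine.
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def node_work(chunk):
    """Each 'node' (process) splits its chunk across local 'threads'."""
    with ThreadPoolExecutor(max_workers=2) as threads:
        return sum(threads.map(lambda x: x * x, chunk))

if __name__ == "__main__":
    chunks = [range(0, 50), range(50, 100)]
    with ProcessPoolExecutor(max_workers=2) as nodes:
        total = sum(nodes.map(node_work, chunks))
    print(total)  # sum of squares of 0..99
```

In real HPC codes the two layers come from different vendors and runtimes, which is exactly why getting them to interoperate cleanly is a research problem rather than a bookkeeping exercise.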

According to Beckett, supercomputing - traditionally focused more on the physical sciences and engineering - is now being picked up by mathematicians, pharmacologists, the wet-lab sciences, the softer sciences, and technology-driven industries.

In many areas of science today, experimentation is being replaced by simulation: scientists build virtual worlds inside a computer and run their experiments within them. “The complexity and challenges in science today are impractical in terms of scale, cost, and speed. Computers offer a new form of laboratory in which to do experiments, a virtual environment where you control the size of the universe you are simulating,” Beckett tells youris. “A more detailed universe means a more complex computing problem to solve, which requires a bigger computer. By having more powerful computers we have more opportunities to push our understanding in these science areas.”
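Beckett's point about detail can be made concrete with simple arithmetic: in a three-dimensional simulation, doubling the resolution along each axis multiplies the number of grid points by eight, and the total work typically grows faster still because the simulation time step must usually shrink as well. A small illustrative sketch (the numbers are hypothetical, not from any project in the article):

```python
# Illustrative arithmetic (not from the article): how the size of a
# cubic 3D simulation domain grows as its resolution is refined.

def grid_points(cells_per_axis: int) -> int:
    """Number of grid points in a cubic 3D simulation domain."""
    return cells_per_axis ** 3

for n in (100, 200, 400):
    print(f"{n:>4} cells per axis -> {grid_points(n):>12,} grid points")
# Each doubling of resolution multiplies the point count by 8.
```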

“The EU has invested heavily in research and development (R&D) in a supercomputing programme focused on exascale. The current investment amounts to around 700 million euros. There is a roadmap toward producing exascale systems in the early 2020s,” says Beckett. Europe’s approach is to build these systems in a way that is sympathetic to environmental and economic needs.

The H2020 FET-HPC programme also inspired new-media visual artists Špela Petrič and Miha Turšič, who transformed algorithmic inputs into works of artistic value that the general public can more easily grasp. Becoming.A (Thing), created under the Future and Emerging Art and Technologies (FEAT) project, is the result of their interpretation. It will be exhibited at the Bozar Centre for Fine Arts in Brussels, Belgium, from 14th to 30th September 2017.