Over at Scientific Computing, IDC’s Steve Conway writes that the rise of high-performance data analysis (HPDA) Big Data applications will require unprecedented memory and I/O capabilities. To address this market opportunity, vendors will need to return to their roots and architect more balanced supercomputing systems. The question is: Have we seen the end of machines optimized for Linpack?

Storage and data movement can no longer remain secondary considerations. The goal of using available budgets to maximize peak and Linpack flops (“machoflops”) will need to give way over time to a sharper focus on user requirements for sustained performance and time-to-solution. Already, one leading site, NCSA, has declined to submit high-performance Linpack numbers for the Top500 rankings. The Top500 list will remain valuable for tracking the census of large systems and the trends affecting them over time, but the shift away from strong compute-centrism will make it even more important to develop more balanced benchmarks for HPC buyers and users.
