SC17 Preview – Back to the Future!

By Dairsie Latimer

October 24, 2017

I’m really looking forward to returning to Denver this year, and I’ll admit I have a real soft spot for Blue Bear. As a regular SC attendee (and occasional exhibitor) over the last twelve years or more (ok, my memory that far back is a little hazy), my interest levels, and perhaps my cynicism levels, are starting to build as I write this. Key areas of interest are being discussed with colleagues, meeting arrangements are firming up, and promises to catch up with old friends have been made. All the usual build-up for our industry’s headline event.

So what will the unofficial theme be at SC17?

Well, there’s the Top500 list of course. While Summit and Sierra, two of the US exascale pathfinder systems (part of the CORAL initiative), are being installed, they probably won’t make a showing this November. China will look to pack out the Top 5 with more home-grown technology, including the upgraded Tianhe-2A (and is probably hoping to be first past the sustained 100-petaflop post). The Chinese exascale programme is taking shape, but like everyone else’s it faces real challenges in delivering effective systems that don’t require a new coal-fired power station to be built next door.

We may find out a little more about what the recast Aurora programme at Argonne will look like (though I doubt much will be said about it, even behind closed doors), but the delayed gestation shows there are real question marks over what this new incarnation will actually be when (and if) it arrives in 2021.

I think we’ll be hearing a lot more about ARM and HPC this year. Hopefully we will even see real, live tin running on the exhibition floor with CPUs from Cavium and Qualcomm, especially after some of the noise generated at ISC17 earlier in the year. It may be a bit early to hear from production users at the ARM user group session at this SC, but the time is finally approaching.

There seems to be a bit more noise about quantum computing this year, with Intel now getting in on the act. While the strategic potential for certain verticals is there, I think we’re still early in the technology adoption curve that most new innovations follow, so for now it’s still hype over practical substance. If someone can show me a widely adoptable programming model for quantum computing, I’ll listen with interest.

SC17 also looks like it will be the show where architectures aimed squarely at machine learning come to the show floor. NVIDIA has been blazing a trail in ML and DL for a while now (with Google and Microsoft also getting in on the act), but with the recent public announcements of Intel’s Nervana Neural Network Processor (NNP) and Graphcore’s Intelligence Processing Unit (IPU) architecture, we will see some of the other early players squaring off. It remains to be seen how their unique approaches to solving problems in the ML space will work in practice, but this has all the makings of a humdinger of a cage fight.

There’s another thread running through our world these days: the convergence, at least ideologically, of HPC and big data (and to that you can add visualisation). Perhaps it’s always been there, but are we now seeing a deliberate convergence of hardware and software platforms for HPC and big data? Is this due to financial necessity, or because there are real synergies to be found? Do we even have the right tools for the job? I hope to find some answers to these questions at SC17 – vendors, you have been warned.

Personally, I think the theme of SC17 is going to be the re-emergence of notable technological innovation in the HPC space. It feels a little like the Cambrian explosion: every technological niche is about to undergo an intense period of competition, and once the evolutionary dust has settled we will return to a period of relative stability. The question for users will be how to cope with significant architectural changes that affect code structure, while still sustaining code maintainability and extracting real benefits from the new technologies. Oh, did I mention that’s part of the theme of one of the panels at SC this year?

As usual there’s lots going on at SC17, and Red Oak (@redoakHPC) are busier than ever, helping to deliver three half-day tutorials over the course of two days as well as organising a panel on software sustainability. Check out our website and blog for more details.

I look forward to seeing you there!

About the Author

Dairsie has a somewhat eclectic background, having worked in a variety of roles on both the supplier and client sides, across the commercial and public sectors, as a consultant and software engineer. Following an early career in computer graphics, micro-architecture design and full-stack software development, he has over twelve years’ specialist experience in the HPC sector, ranging from developing low-level libraries and software for novel computing architectures to porting complex HPC applications to a range of accelerators. He also advises clients on strategy, technology futures, HPC procurements and managing challenging technical projects.
