Santiago Nuñez-Corrales, a graduate researcher at NCSA and Ph.D. student in Informatics at the University of Illinois at Urbana-Champaign, was one of twelve recipients of the SIGHPC/Intel Computational and Data Science Fellowship for 2017. “The fellowship brings with it so many benefits,” said Nuñez-Corrales. “Attending SC17 is important to keep up with developments in the HPC community, and the financial support is critical to continuing my degree, as I have no funding from my home country.”

In this video, researchers describe how the new HPC facility at Rockefeller University will power bioinformatics research and more. This is the first time that Rockefeller University has purpose-built a datacenter for high performance computing.

Researchers at RENCI are using xDCI Data CyberInfrastructure to manage brain microscopy images that were overwhelming the storage capacity at individual workstations. “BRAIN-I is a computational infrastructure for handling these huge images combined with a discovery environment where scientists can run applications and do their analysis,” explained Mike Conway, a senior data science researcher at RENCI. “BRAIN-I deals with big data and computation in a user-friendly way so scientists can concentrate on their science.”

Kathy Yelick from LBNL will give the HPC keynote on Exascale computing at the upcoming ACM Europe Conference. With main themes centering on Cybersecurity and High Performance Computing, the event takes place Sept. 7-8 in Barcelona.

Employing a hybrid of MPI across the nodes of a cluster, multithreading with OpenMP* on each node, and vectorization of loops within each thread multiplies performance gains. In fact, most application codes will run slower on the latest supercomputers if they run purely sequentially. This means that adding multithreading and vectorization to applications is now essential for running efficiently on the latest architectures.

In this video, Scott Harrison from Rock Flow Dynamics and Bruno Franzini from NICE Software explain how they scale HPC workloads in the cloud. “You’ll learn how they leverage Amazon EC2 Spot instances and Amazon S3 to create cost-effective, scalable clusters that power tNavigator, Rock Flow Dynamics’ solution for running dynamic reservoir simulations. You’ll also see how they use NICE Software’s DCV to stream the OpenGL-based user interface to interact with 3D models.”

Illinois Rocstar is seeking a Sr. Scientific Programmer in our Job of the Week. “Have you ever thought that you’d like to write code to model something awesome? Like rockets? Aircraft? Explosions? Advanced battery performance? That’s what we do at Illinois Rocstar! Producing tools, interfaces, and systems to run on machines ranging from high-end engineering workstations to the most advanced high performance computing platforms in the world, we are all about making High Performance Computational Science and Engineering better and easier to use.”

Professor Philip Diamond, Director General of the international Square Kilometre Array (SKA) project, will be the keynote speaker at SC17. “Professor Diamond, accompanied by Dr. Rosie Bolton, SKA Regional Centre Project Scientist, will take SC17 attendees around the globe and out into the deepest reaches of the observable universe as they describe the SKA’s international partnership that will map and study the entire sky in greater detail than ever before.”

Today Dell EMC announced it will build a supercomputer to power Swinburne University of Technology’s groundbreaking research into astrophysics and gravitational waves. “We will be looking for gravitational waves that help us learn more about supernovas, the formation of stars, intergalactic gases and more,” said Swinburne Professor Matthew Bailes. “It’s exciting to think that we as OzGRav could make the next landmark discovery in gravitational wave astrophysics – and the Dell EMC supercomputer will allow us to capture, visualise and process the data to make those discoveries.”

In this video from Evan Schneider at Princeton University, hot galactic winds destroy a cool cloud of interstellar gas. “The process of generating galactic winds is something that requires exquisite resolution over a large volume to understand—much better resolution than other cosmological simulations that model populations of galaxies,” said her collaborator, Brant Robertson. “This is something you really need a machine like Titan to do.”
