In this video from PASC18, Maurizio Pierini from CERN presents: Generative Models for Application-Specific Fast Simulation of LHC Collision Events.

“We investigate the possibility of using generative models (e.g., GANs and variational autoencoders) as analysis-specific data augmentation tools to increase the size of the simulation data used by the LHC experiments. With the LHC entering its high-luminosity phase in 2025, the projected computing resources will not be able to sustain the demand for simulated events. Generative models are already being investigated as a means to speed up the centralized simulation process. Here we propose to investigate a different strategy: training deep networks to generate small-dimension ntuples of numbers (physics quantities such as reconstructed particle energy and direction), learning the distribution of these quantities from a sample of simulated data. In one step, one would then be able to generate the outcome of the full processing workflow (generation + simulation + reconstruction + selection).”
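To make the idea concrete, the sketch below illustrates the workflow the abstract describes: learn the distribution of a few reconstructed quantities from simulated events, then sample new "events" in one step, bypassing the full generation + simulation + reconstruction chain. This is a deliberately minimal stand-in: it fits a multivariate Gaussian where the proposal would train a GAN or variational autoencoder, and the toy ntuple columns (pT, eta, phi) and their distributions are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy "ntuple": one row per event, with reconstructed
# transverse momentum (GeV), pseudorapidity, and azimuthal angle.
# Pretend these came from the full simulation + reconstruction chain.
n_sim = 10_000
pt = rng.exponential(scale=50.0, size=n_sim)
eta = rng.normal(0.0, 1.5, size=n_sim)
phi = rng.uniform(-np.pi, np.pi, size=n_sim)
ntuple = np.column_stack([pt, eta, phi])

# "Training": learn the joint distribution of the reconstructed
# quantities. A GAN/VAE would capture non-Gaussian shapes and
# correlations that this simple Gaussian fit cannot.
mean = ntuple.mean(axis=0)
cov = np.cov(ntuple, rowvar=False)

# "Generation": draw new events directly from the learned model,
# skipping generation, simulation, reconstruction, and selection.
fake_events = rng.multivariate_normal(mean, cov, size=5000)
print(fake_events.shape)  # (5000, 3)
```

A Gaussian obviously cannot reproduce features like the strictly positive pT spectrum, which is exactly why the proposal turns to deep generative models for the learned density.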

Maurizio Pierini is the coordinator of the Physics Performance and Dataset (PPD) area at CERN. “Our task is to provide CMS with the detector conditions (alignment and calibrations), assure the quality of our data (online and offline data quality monitoring and certification), and validate the performance of the software for generation, simulation, reconstruction, and physics-object definition. It is our task to coordinate the Monte Carlo production and the dataset (re)processing, including the definition of primary datasets, and skims for physics and detector studies. Being responsible for the datasets, we also take care of data-flow-related applications like hotline, event display, and scouting.”
