
Scientific Topic: In this workshop we aim to explore, and to encourage, the use of state-of-the-art machine learning algorithms for research in dark matter physics and astronomy. Our objective is to accelerate the identification of dark matter with a multidisciplinary approach: the researchers coming to this workshop bring together expertise in experimental and theoretical particle physics, astrophysics, astronomy, statistics and machine learning. The workshop is planned as a kick-off meeting to generate a new open research community. We plan a whitepaper, follow-up workshops, a webpage and a mailing list for the DM-ML community.

Workshop Format: Lorentz Workshops@Oort are scientific meetings for small groups of up to 60 participants, including both senior and junior scientists. Lorentz Center meetings dedicate a considerable amount of time to discussion sessions, thus stimulating an interactive atmosphere and encouraging collaborations between participants. This format typically generates extensive debates and enables significant progress to be made within the research topic of the meeting.

Strong gravitational microlensing (GM) events give us the possibility to determine some characteristics of both the microlens and the microlensed source. Since the role of the microlens can be played by a DM clump, GM can give us an important clue to understanding the nature of dark matter on comparatively small spatial/mass scales. At the same time, fitting the light curves of microlensed sources is a quite time-consuming process, especially when a nonzero lens size is taken into account. Here we test the possibility of applying statistical machine learning techniques to distinguish high-amplification microlensing events (HAME) caused by a continuously distributed DM clump from star- or black-hole-induced microlensing (i.e. where the microlens is considered a point-like mass). At this stage we use a set of simulated HAME amplification curves of sources microlensed by point masses and by clumps of DM with various density profiles.
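As an illustration of the kind of classification task this abstract describes (not the authors' actual pipeline), the sketch below simulates Paczyński point-lens amplification curves, uses a Gaussian-smoothed curve as a crude stand-in for an extended DM clump, and separates the two classes with a simple nearest-centroid classifier on two summary features. The clump model, function names, and feature choices are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def point_lens_curve(t, t0, tE, u0):
    # Paczynski magnification for a point-mass lens:
    #   A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)),  u(t) = sqrt(u0^2 + ((t - t0)/tE)^2)
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def clump_lens_curve(t, t0, tE, u0, width):
    # Toy stand-in for an extended DM clump: the point-lens curve convolved
    # with a Gaussian kernel, which broadens the event and lowers its peak.
    base = point_lens_curve(t, t0, tE, u0)
    kernel = np.exp(-0.5 * (np.arange(-25, 26) / width) ** 2)
    kernel /= kernel.sum()
    return np.convolve(base, kernel, mode="same")

t = np.linspace(-2.0, 2.0, 401)

def features(curve):
    # Two summary features: peak magnification and width above half the peak excess.
    peak = curve.max()
    half = 1.0 + 0.5 * (peak - 1.0)
    return np.array([peak, (curve > half).sum()])

# Simulate a labelled training set (0 = point lens, 1 = DM clump).
X, y = [], []
for _ in range(200):
    u0, tE = rng.uniform(0.05, 0.3), rng.uniform(0.3, 0.8)
    X.append(features(point_lens_curve(t, 0.0, tE, u0))); y.append(0)
    X.append(features(clump_lens_curve(t, 0.0, tE, u0, rng.uniform(5, 15)))); y.append(1)
X, y = np.array(X), np.array(y)

# Nearest-centroid classifier on standardized features.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
pred = (np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
```

In a realistic analysis the summary features would be replaced by the full amplification curves and a more expressive classifier, but the pipeline shape (simulate both lens classes, extract discriminating statistics, classify) is the same.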

Emulsion-based detectors, such as those used in the OPERA experiment or planned for the SHiP and NEWS experiments, may reveal important characteristics of WIMP-like particles. However, due to the nature of the emulsion, the signal-to-noise ratio tends to be rather small and hence may require special reconstruction techniques. Advanced data analysis approaches based on machine learning might therefore improve the "physical" sensitivity of the experiments. In this talk I will give a brief overview of machine learning techniques that can be applied to dark matter searches in the SHiP and NEWS experiments, and present the current challenges for those experiments from both the physics and the data analysis points of view.

Estimating the parameters of gravitational lenses with deep learning (15m)

Machine learning methods have seen a rapid expansion in the last few years. In particular, deep learning has made several breakthroughs, including beating a champion of the game of Go and outperforming practicing dermatologists in the visual diagnosis of skin cancer. Although in most applications these networks have been used for classification tasks, they can also be made to predict real-valued model parameters. In this talk, I will discuss our results on using deep convolutional neural networks to estimate the parameters of strong gravitational lenses from telescope data. Estimating these parameters with traditional maximum-likelihood modeling methods is a time- and resource-consuming procedure, involving several data preparation steps and a difficult optimization process. With deep convolutional neural networks we are able to estimate these parameters in a fully automated way, 10 million times faster than traditional modeling methods and with similar accuracy. I will also discuss how to robustly quantify the uncertainties of these networks, which allows them to serve as a fast alternative to MCMC sampling. With the advent of large volumes of data from upcoming ground and space surveys, and the remarkable speed offered by these networks, deep learning promises to become an indispensable tool for the analysis of large survey data.
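The core idea here is amortized inference: learn a direct map from images to parameters once, then evaluate it cheaply on new data. The toy sketch below illustrates that idea with a linear least-squares readout standing in for the deep convolutional network, and a thin ring standing in for a lensed arc, regressing an "Einstein radius" directly from simulated noisy images. The image model and all names are illustrative assumptions, not the speaker's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n, size = 500, 16
yy, xx = np.mgrid[0:size, 0:size]
r = np.sqrt((xx - size / 2) ** 2 + (yy - size / 2) ** 2)

# Toy images: a thin ring whose radius plays the role of the Einstein radius.
theta_E = rng.uniform(2.0, 6.0, n)
images = np.exp(-0.5 * ((r[None] - theta_E[:, None, None]) / 0.8) ** 2)
images += 0.05 * rng.normal(size=images.shape)      # observational noise

X = np.hstack([images.reshape(n, -1), np.ones((n, 1))])   # flattened pixels + bias
train, test = slice(0, 400), slice(400, None)

# Linear least-squares "network": one dense layer from pixels to the parameter.
w, *_ = np.linalg.lstsq(X[train], theta_E[train], rcond=None)
pred = X[test] @ w
rmse = np.sqrt(np.mean((pred - theta_E[test]) ** 2))
```

Fitting `w` is the (one-off) expensive step; prediction on a new image is a single matrix product, which is what makes the amortized approach so much faster than per-image optimization.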

Supersymmetry (SUSY) is able to solve the hierarchy problem and can provide a perfect dark matter candidate. The non-observation of SUSY particles at the LHC and of dark matter particles at dedicated experiments pushes the SUSY particles to be heavier and heavier, which is assumed to make it more and more difficult for SUSY to solve the hierarchy problem, as it gives rise to the need for fine-tuning of the input parameters of the theory. We are studying the allowed parameter space of several SUSY models. These models typically have a large number of parameters (10-30). We aim to find the set of allowed parameters that minimizes the fine-tuning of these SUSY models. This is a resource-consuming process, and we would like to discuss how to do it more efficiently.
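To make the notion of fine-tuning concrete, here is a minimal numerical sketch of the Barbieri-Giudice measure Δ = max_i |∂ ln m_Z² / ∂ ln p_i|, evaluated with finite differences on a two-parameter toy relation (the large-tan β tree-level MSSM condition m_Z²/2 ≈ −m_Hu² − μ²); this is an illustration of the measure, not one of the multi-parameter models the abstract refers to.

```python
import numpy as np

def mz_squared(params):
    # Toy tree-level relation (large tan(beta) limit of the MSSM):
    #   m_Z^2 / 2 ~= -m_Hu^2 - mu^2   (m_Hu2 in GeV^2, mu in GeV)
    m_Hu2, mu = params
    return 2.0 * (-m_Hu2 - mu**2)

def fine_tuning(params, eps=1e-6):
    # Barbieri-Giudice measure: Delta = max_i |d ln m_Z^2 / d ln p_i|,
    # estimated with central finite differences in ln p_i.
    params = np.asarray(params, dtype=float)
    deltas = []
    for i in range(len(params)):
        up, dn = params.copy(), params.copy()
        up[i] *= 1 + eps
        dn[i] *= 1 - eps
        dlogm = np.log(abs(mz_squared(up))) - np.log(abs(mz_squared(dn)))
        deltas.append(abs(dlogm / (2 * eps)))
    return max(deltas)

# Scan mu, fixing m_Hu2 so every point reproduces the observed m_Z at tree level,
# and record the least fine-tuned point.
mz2_target = 91.1876**2
mus = np.linspace(100.0, 1000.0, 50)
tunings = [fine_tuning((-(mu**2 + mz2_target / 2.0), mu)) for mu in mus]
best_mu = mus[int(np.argmin(tunings))]
```

In this toy relation the tuning grows like 4μ²/m_Z², so the scan prefers the smallest μ; in a realistic 10-30 parameter SUSY model each likelihood or spectrum evaluation is expensive, which is exactly why a smarter-than-grid search strategy is wanted.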

Although the standard model of particle physics is successful in describing physics as we know it, it is known to be incomplete. Many models have been developed to extend the standard model, none of which have been experimentally verified. One of the main hurdles in this effort is the dimensionality of these models, yielding problems in analysing, visualising and communicating results. Because of this, most current-day analyses are done using simplified models, but in this process descriptive power is lost. However, by using machine learning on simulated model points, we show that we can overcome these problems and predict both binary exclusion and a continuous likelihood in any parameter space. The simulated data will be stored in our new web-based database and model visualisation tool, iDarkSurvey. This tool will be open to the scientific community for storing all calculated model data.
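The kind of surrogate the abstract describes can be sketched on a toy two-parameter model space: a hypothetical circular exclusion region (binary classification) and a hypothetical Gaussian log-likelihood surface (regression), with plain k-nearest-neighbour averaging standing in for whatever machine-learning models are actually used.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 2-parameter model space: a point is "excluded" inside a circle, and its
# log-likelihood falls off with distance from a best-fit point. Both are
# invented stand-ins for expensive simulation output.
def true_excluded(p):
    return (np.linalg.norm(p - np.array([0.3, 0.3]), axis=1) < 0.25).astype(int)

def true_loglike(p):
    return -0.5 * np.sum(((p - np.array([0.7, 0.6])) / 0.2) ** 2, axis=1)

train = rng.uniform(0, 1, (2000, 2))    # "simulated model points"
test = rng.uniform(0, 1, (200, 2))      # unseen parameter points

def knn_predict(query, points, values, k=10):
    # k-nearest-neighbour surrogate: average the k closest training values.
    d = np.linalg.norm(points[None] - query[:, None], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    return values[idx].mean(axis=1)

# Binary exclusion (threshold the averaged labels) and continuous likelihood.
excl_pred = (knn_predict(test, train, true_excluded(train).astype(float)) > 0.5).astype(int)
ll_pred = knn_predict(test, train, true_loglike(test * 0 + train[:200]) if False else true_loglike(train))
ll_pred = knn_predict(test, train, true_loglike(train))

excl_acc = (excl_pred == true_excluded(test)).mean()
ll_rmse = np.sqrt(np.mean((ll_pred - true_loglike(test)) ** 2))
```

Once trained, the surrogate answers "is this point excluded, and how likely is it?" without rerunning the simulation chain, which is what makes a shared database of precomputed model points like iDarkSurvey useful.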

Speaker: Bob Stienen (Radboud University)

15:10 → 15:25

Using Deep Learning to predict Electroweakino production cross-sections at the LHC (15m)