Projects with September 2019 start

The XENON Dark Matter Experiment: Data Analysis

The XENON collaboration operates the XENON1T detector, the world's most sensitive direct-detection dark matter experiment, and the Nikhef group plays an important role in it. The detector operates at the Gran Sasso underground laboratory and consists of a so-called dual-phase xenon time-projection chamber filled with 3200 kg of ultra-pure xenon. Our group has an opening for a motivated MSc student to analyse data from this detector. The work consists of understanding the signals that come out of the detector and applying machine learning tools to improve the reconstruction performance in our Python-based analysis tool. The final goal is to improve the signal-to-background ratio of the dark matter search. There will also be opportunities to do data-taking shifts at the Gran Sasso underground laboratory in Italy.
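
To give a flavour of the machine-learning side of the work, the sketch below trains a classifier to separate two toy event populations using two hypothetical features inspired by a dual-phase TPC (S1 and S2 signal areas). All feature values, units and separations are invented for this example and do not describe real XENON1T data or the actual analysis pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Toy features loosely inspired by a dual-phase TPC: S1 and S2 signal
# areas (hypothetical units). In this toy, "signal" events have a lower
# S2/S1 ratio than "background" events.
n = 2000
s1_sig = rng.normal(20, 5, n); s2_sig = s1_sig * rng.normal(30, 8, n)
s1_bkg = rng.normal(20, 5, n); s2_bkg = s1_bkg * rng.normal(60, 12, n)

X = np.column_stack([np.concatenate([s1_sig, s1_bkg]),
                     np.concatenate([s2_sig, s2_bkg])])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = signal, 0 = background

# Train on a random half of the events, evaluate on the other half.
idx = rng.permutation(y.size)
tr, te = idx[:y.size // 2], idx[y.size // 2:]
clf = GradientBoostingClassifier().fit(X[tr], y[tr])
acc = clf.score(X[te], y[te])
print(f"test accuracy: {acc:.2f}")
```

In the real analysis the features, simulation-driven training samples and performance metrics are of course far more involved than this two-feature toy.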

The Modulation Experiment: Data Analysis

A few measurements suggest an annual modulation in the activity of radioactive sources. Together with a few groups from the XENON collaboration, we have developed four sets of table-top experiments to investigate this effect with a few well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, near a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) analyze the first big data set and (2) contribute to the first physics paper from the experiment. We are looking for all-round physicists with an interest in both lab work and data analysis. The student(s) will collaborate directly with the other groups in this small collaboration (around 10 people), and the goal is to have the first physics publication ready by the end of the project. During the 2018-2019 season there are positions for two MSc students.
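
The core of such an analysis is fitting the measured decay rate with a modulation term on top of the exponential decay. A minimal least-squares sketch with invented numbers (rate, lifetime, modulation amplitude and phase are all toy values):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Decay rate with exponential decay and a small annual modulation;
# the modulation amplitude a and phase t0 are the parameters of interest.
def rate(t, r0, tau, a, t0):
    return r0 * np.exp(-t / tau) * (1 + a * np.cos(2 * np.pi * (t - t0) / 365.25))

# Toy data set: three years of daily rate measurements (invented values).
t = np.arange(0, 3 * 365, 1.0)                    # days
y = rate(t, 1000.0, 4000.0, 0.002, 100.0) + rng.normal(0, 0.5, t.size)

popt, pcov = curve_fit(rate, t, y, p0=(900.0, 3000.0, 0.01, 90.0))
a_fit, a_err = popt[2], np.sqrt(pcov[2, 2])
print(f"modulation amplitude: {a_fit:.4f} +/- {a_err:.4f}")
```

A real analysis would of course also model detector and environmental effects (temperature, humidity, electronics drift) that can mimic such a modulation.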

Projects with September 2018 start

Theory: Stress-testing the Standard Model at the high-energy frontier

A suitable framework to parametrise, in a model-independent way, deviations from the Standard Model (SM) induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM (bSM) effects are encapsulated in higher-dimensional operators constructed from SM fields and respecting their symmetry properties. Here we aim to perform a global analysis of the SMEFT using high-precision LHC data. This will be achieved by extending the NNPDF fitting framework to constrain the SMEFT coefficients, with the ultimate aim of identifying possible bSM signals.
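
At linear order in the Wilson coefficients, constraining the SMEFT often reduces to a chi-square fit of predictions of the form sigma(c) = sigma_SM + c * sigma_int to measured cross sections. A toy one-coefficient version (all measurements, uncertainties and predictions below are invented numbers, not LHC data):

```python
import numpy as np

# Toy "measurements": three cross sections (hypothetical units) with
# uncertainties, plus the corresponding SM and interference predictions.
sigma_exp = np.array([10.2, 5.1, 1.05])
sigma_err = np.array([0.4, 0.3, 0.08])
sigma_sm  = np.array([10.0, 5.0, 1.00])
sigma_int = np.array([2.0, 4.0, 1.5])   # linear (dim-6 interference) term

# Linear SMEFT prediction: sigma(c) = sigma_sm + c * sigma_int.
# The chi^2 is then quadratic in c, so the minimum is analytic
# (a weighted least-squares solution).
w = 1.0 / sigma_err**2
c_hat = np.sum(w * sigma_int * (sigma_exp - sigma_sm)) / np.sum(w * sigma_int**2)
c_err = 1.0 / np.sqrt(np.sum(w * sigma_int**2))
print(f"c = {c_hat:.3f} +/- {c_err:.3f}")
```

The actual global fit involves many operators at once, quadratic terms, and correlated experimental and theoretical uncertainties.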

Theory: The quark and gluon internal structure of heavy nuclei in the LHC era

A precise knowledge of the parton distribution functions (PDFs) of the proton is essential in order to make predictions for the Standard Model and beyond at hadron colliders. The presence of the nuclear medium and of collective phenomena involving several nucleons modifies the parton distribution functions of nuclei (nPDFs) compared to those of a free nucleon. These modifications have been investigated by different groups using global analyses of world data on high-energy nuclear reactions. It is important to determine the nPDFs not only to establish perturbative QCD factorisation in nuclei, but also for applications to heavy-ion physics and neutrino physics. In this project the student will join an ongoing effort towards the determination of a data-driven model of nPDFs, and will learn how to construct tailored Artificial Neural Networks (ANNs).
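
The idea of a data-driven, neural-network parametrisation can be illustrated with a toy fit: parametrise a nuclear modification ratio R(x) with a small network and fit it to pseudo-data. For compactness the hidden weights below are random and fixed, and only the linear output layer is solved for exactly; a real analysis trains all weights and propagates the experimental uncertainties. The shape of R(x) and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "data": a nuclear modification ratio R(x) with a shadowing dip at
# low x and an anti-shadowing bump (illustrative shape, not a real nPDF).
x = np.logspace(-3, -0.05, 60)
t = np.log10(x)
r_true = (1.0 - 0.15 * np.exp(-((t + 2.5) / 0.5) ** 2)
              + 0.08 * np.exp(-((t + 1.0) / 0.3) ** 2))
r_data = r_true + rng.normal(0, 0.01, x.size)

# Network parametrisation R(x) = w . tanh(a*u + b) + c, with u a
# standardised log10(x) input. Hidden weights are random and fixed here,
# so the output layer is a linear least-squares problem.
u = (t - t.mean()) / t.std()
nh = 10
a = rng.normal(0, 1.5, nh)
b = rng.normal(0, 1.0, nh)
H = np.tanh(np.outer(u, a) + b)                  # (60, nh) hidden features
A = np.column_stack([H, np.ones(x.size)])        # append a bias column
coef, *_ = np.linalg.lstsq(A, r_data, rcond=None)

rmse = np.sqrt(np.mean((A @ coef - r_data) ** 2))
print(f"fit RMSE: {rmse:.3f}")
```

The flexible functional form is the point: unlike a fixed polynomial ansatz, the network shape is driven by the data.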

Theory: Parton-to-hadron fragmentation functions from a global QCD analysis

The formation of hadrons from quarks and gluons (collectively, partons) is a fundamental QCD process that has yet to be fully understood. Since parton-to-hadron fragmentation occurs over long-distance scales, such information can only be extracted from experimental observables that identify mesons and baryons in the final state. Recent progress has been made in determining these fragmentation functions (FFs) from charged pion and kaon production in single-inclusive e+e- annihilation (SIA), and additionally in pp collisions and semi-inclusive deep inelastic scattering (SIDIS). However, charged hadron production in unpolarized pp collisions and inelastic lepton-proton scattering also requires information about the momentum distributions of the quarks and gluons in the proton, which is encoded in non-perturbative parton distribution functions (PDFs). In this project, a simultaneous treatment of both PDFs and FFs in a global QCD analysis of single-inclusive hadron production processes will be performed to determine the individual parton-to-hadron FFs. Furthermore, a robust statistical methodology with an artificial neural network learning algorithm will be used to obtain a precise estimation of the FF uncertainties. This work will emphasise in particular the impact of pp-collision and SIDIS data on the gluon and separated quark/anti-quark FFs, respectively.

ALICE: Charm is in the Quark Gluon Plasma

The goal of heavy-ion physics is to study the Quark Gluon Plasma (QGP), a hot and dense medium in which quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations predict that the QGP expands under its own pressure and cools while expanding. These simulations are particularly successful in describing some of the key observables measured experimentally, such as particle spectra and various orders of flow harmonics. Charm quarks are produced very early in the evolution of a heavy-ion collision and can thus serve as an ideal probe of the properties of the QGP. The goal of the project is to study, for charm mesons such as the D0, D* and Ds, the higher-order flow harmonics (e.g. triangular flow, v3) that are more sensitive to the transport properties of the QGP. This will be the first ever measurement of this kind.
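
Flow harmonics are the Fourier coefficients v_n of the azimuthal particle distribution, dN/dphi proportional to 1 + 2 * sum_n v_n cos(n(phi - Psi_n)). A toy estimate with the event plane fixed at zero (input v2 and v3 values are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample azimuthal angles from
#   dN/dphi ∝ 1 + 2 v2 cos(2 phi) + 2 v3 cos(3 phi)
# with the event plane fixed at zero (toy input values).
v2_in, v3_in = 0.10, 0.05

def sample(n):
    out = np.empty(0)
    while out.size < n:
        phi = rng.uniform(0, 2 * np.pi, n)
        f = 1 + 2 * v2_in * np.cos(2 * phi) + 2 * v3_in * np.cos(3 * phi)
        out = np.concatenate([out, phi[rng.uniform(0, 1.5, n) < f]])  # accept-reject
    return out[:n]

phi = sample(200_000)
# With the event plane at zero, v_n is simply <cos(n phi)>.
v2 = np.cos(2 * phi).mean()
v3 = np.cos(3 * phi).mean()
print(f"v2 = {v2:.3f}, v3 = {v3:.3f}")
```

In data the event plane is not known, so in practice v_n is extracted with event-plane or multi-particle cumulant methods, and for D mesons on top of an invariant-mass fit.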

ALICE: Probing the time evolution of particle production in the Quark-Gluon Plasma

Particle production is governed by conservation laws, such as local charge conservation. The latter ensures that each charged particle is balanced by an oppositely-charged partner, created at the same location in space and time. Charge-dependent angular correlations, traditionally studied with the balance function, have emerged as a powerful tool to probe the properties of the Quark-Gluon Plasma (QGP) created in high energy collisions. The goal of this project is to take full advantage of the particle-identification capabilities of the ALICE detector, unique among the LHC experiments, to extend these studies to different particle species (e.g. pions, kaons, protons). These studies are highly anticipated by both the experimental and theoretical communities.
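
In essence, the balance function measures the excess of opposite-sign over same-sign pairs as a function of their separation. A toy version in pseudorapidity, using a simplified symmetric form and balanced pairs created close together (all parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(4)

n_ev, n_pairs = 400, 30
bins = np.linspace(0, 3, 16)
diff = np.zeros(bins.size - 1)     # N_opposite(deta) - N_same(deta)
n_minus_tot = 0

for _ in range(n_ev):
    # Each event: n_pairs balanced +/- pairs born close in pseudorapidity
    # (local charge conservation), smeared by a toy diffusion width.
    eta0 = rng.uniform(-1, 1, n_pairs)
    eta = np.concatenate([eta0 + rng.normal(0, 0.15, n_pairs),
                          eta0 - rng.normal(0, 0.15, n_pairs)])
    q = np.concatenate([np.ones(n_pairs), -np.ones(n_pairs)])
    n_minus_tot += n_pairs

    i, j = np.triu_indices(eta.size, 1)        # all unordered pairs
    deta = np.abs(eta[i] - eta[j])
    same = q[i] * q[j] > 0
    diff += (np.histogram(deta[~same], bins)[0]
             - np.histogram(deta[same], bins)[0])

# Simplified balance function: for equal +/- multiplicities it integrates
# to ~1 when every charge is balanced within the acceptance.
balance = diff / n_minus_tot
print(f"integrated balance: {balance.sum():.2f}")
```

The width of the near-side peak is the physics: a narrower balance function is associated with delayed hadronization and radial flow in the QGP.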

ALICE: CP violating effects in QCD: looking for the chiral magnetic effect with ALICE at the LHC

Within the Standard Model, symmetries, such as the combination of charge conjugation (C) and parity (P), known as CP-symmetry, are considered to be key principles of particle physics. CP violation can be accommodated within the Standard Model in both the weak and the strong interactions; however, it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where the parity symmetry is locally violated. This manifests itself in a charge-dependent asymmetry in the production of particles relative to the reaction plane, called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables of the CME, trying to understand and discriminate the background sources that affect such a measurement.

ALICE: Particle polarisation in strong magnetic fields

When two atomic nuclei, moving in opposite directions, collide off-center, the Quark Gluon Plasma (QGP) created in the overlap zone is expected to rotate. The nucleons not participating in the collision represent electric currents generating an intense magnetic field. The magnetic field could be as large as 10^{18} gauss, orders of magnitude larger than the strongest magnetic fields found in astronomical objects. Proving the existence of the rotation and/or the magnetic field could be done by checking whether particles with spin are aligned with the rotation axis, or whether charged particles have different production rates relative to the direction of the magnetic field. In particular, the longitudinal and transverse polarisation of the Lambda^0 baryon will be studied. This project requires some affinity with computer programming.

ATLAS: Excited lepton searches with multiple leptons

The Standard Model of particle physics (SM) is extremely successful, but will it hold up when checked against data containing multiple leptons? Although it is a very rare process, the production of multiple leptons is calculated in the SM with high precision. On the detector side, leptons (electrons and muons) are easy to reconstruct, and such a sample contains very little "non-lepton" background. This analysis has the ambitious goal of finding beyond-Standard-Model processes, such as excited leptons, using events with 4 leptons. With this project, the student would gain close familiarity with modern experimental techniques (statistical analysis, SM predictions, searches for rare signals), with Monte Carlo generators and with the standard HEP analysis tools (ROOT, C++, Python).

ATLAS: A search for lepton flavor violation with tau decays

Quarks mix, neutrinos mix, charged leptons do not. Why? Is that really how nature works, or is it just a limitation of our detection techniques? ATLAS has now recorded a huge sample of data, and even such difficult final states as tau->3mu become accessible. However, decays of charm and beauty mesons could spoil the picture with decays that resemble the signal. The goal of the project is to understand which background decays are present and to find a way to suppress them. Success of the project will allow a much higher sensitivity to beyond-Standard-Model physics in tau->3mu. The student would gain close familiarity with modern experimental techniques (statistical analysis, SM predictions, searches for rare signals), background suppression techniques and the standard HEP analysis tools (ROOT, C++, Python).

ATLAS: A search for lepton non-universality in Bc meson decays

Recently, the LHCb experiment has reported a number of intriguing deviations from the SM in leptonic decays of B mesons. With this project we would like to probe whether ATLAS also observes the same kind of deviation, e.g. in the Bc->Jpsi+tau+nu channel with respect to Bc->Jpsi+mu+nu. Success of the project will be essential to understand whether we are finally observing a beyond-SM process or whether LHCb has some detector bias. The student would gain close familiarity with modern experimental techniques (statistical analysis, SM predictions, searches for rare signals), background suppression techniques and the standard HEP analysis tools (ROOT, C++, Python).

LHCb: Searching for dark matter in exotic six-quark particles

Most of the matter in the Universe is of unknown type. Many hypotheses about this dark matter have been proposed, but none has been confirmed. Recently it has been suggested that it could consist of particles made of the six quarks uuddss. Such a particle could be produced in decays of heavy baryons, and it is proposed to use Xi_b baryons produced at LHCb to search for such a state. It would appear as missing 4-momentum in a kinematically constrained decay. The project consists of optimising a selection and applying it to LHCb data. See arXiv:1708.08951.
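
The experimental signature is the mass of the system recoiling against what is reconstructed. A minimal sketch of the 4-momentum arithmetic, with an approximate Xi_b mass and a purely hypothetical visible system:

```python
import numpy as np

# Toy kinematics (GeV, natural units): a heavy baryon with known momentum
# decays into a reconstructed visible system plus an invisible particle.
# By 4-momentum conservation:
#   p_miss = p_parent - p_visible,   m_miss^2 = E_miss^2 - |p_miss|^2
m_parent = 5.797                                   # Xi_b mass, approximate
p_parent = np.array([np.sqrt(m_parent**2 + 50.0**2), 0.0, 0.0, 50.0])

# Hypothetical visible system (E, px, py, pz), invented for illustration
p_vis = np.array([30.1, 0.4, -0.2, 29.9])

p_miss = p_parent - p_vis
m_miss = np.sqrt(p_miss[0]**2 - np.sum(p_miss[1:]**2))
print(f"missing mass: {m_miss:.2f} GeV")
```

In the real search the parent momentum is not measured directly, so the kinematic constraints of the reconstructed decay chain are what make the missing mass computable.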

LHCb: Measurement of BR(B0 → Ds+ Ds-)

This project aims to observe the decay B0->Ds+ Ds- and measure its branching fraction. The decay is rare because it proceeds via Cabibbo-suppressed W-exchange between the b and the d quark of the B0 meson. It has not yet been observed; theoretical calculations predict a branching fraction of the order of 10^-5, while the best experimental upper limit is 3.6x10^-5.
A measurement of the decay rate of B0 -> Ds+Ds- relative to that of B0 -> D+D- can provide an estimate of the W-exchange contribution to the latter decay, a crucial piece of information for extracting the CKM angle gamma from B0 -> D(*)D(*).
The aim is to determine the branching fraction of B0->Ds+Ds- relative to that of B0->Ds+D- (currently the best-known branching ratio, (7.2 +- 0.8)x10^-3), in close collaboration with a PhD student. The goal is for this project to result in a journal publication on behalf of the LHCb collaboration. Computer skills are needed: the ROOT program and C++ and/or Python macros are used. The project is closely related to previous analyses in the group, and weekly video meetings with CERN coordinate the efforts within the LHCb collaboration.
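
A relative measurement boils down to scaling the normalisation-mode branching fraction by the ratio of fitted yields and selection efficiencies. A sketch with purely hypothetical yields and efficiencies (only BR(B0 -> Ds+ D-) is taken from the text above):

```python
# Hypothetical yields and efficiencies, for illustration only.
n_sig, n_norm = 25.0, 5200.0        # fitted B0 -> Ds+ Ds- / B0 -> Ds+ D- yields
eff_sig, eff_norm = 0.012, 0.011    # selection efficiencies from simulation
br_norm = 7.2e-3                    # BR(B0 -> Ds+ D-), the normalisation mode

# Detector and luminosity factors cancel in the ratio:
br_sig = (n_sig / n_norm) * (eff_norm / eff_sig) * br_norm
print(f"BR(B0 -> Ds+ Ds-) ~ {br_sig:.2e}")
```

The cancellation of luminosity and most detector systematics in the ratio is the main reason to measure relative to a well-known mode.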
Relevant information:
[1] M. Jung and S. Schacht, "Standard Model Predictions and New Physics Sensitivity in B -> DD Decays", https://arxiv.org/pdf/1410.8396.pdf
[2] L. Bel, K. de Bruyn, R. Fleischer, M. Mulder and N. Tuning, "Anatomy of B -> DD Decays", https://arxiv.org/pdf/1505.01361.pdf
[3] A. Zupanc et al. [Belle Collaboration], "Improved measurement of B0 -> Ds+ D- and search for B0 -> Ds+ Ds- at Belle", https://arxiv.org/pdf/hep-ex/0703040.pdf
[4] B. Aubert et al. [BaBar Collaboration], "Search for the W-exchange decays B0 -> Ds(*)+ Ds(*)-", https://arxiv.org/pdf/hep-ex/0510051.pdf
[5] R. Aaij et al. [LHCb Collaboration], "First observations of B0s -> D+ D-, Ds+ D- and D0 D0bar decays", https://arxiv.org/pdf/1302.5854.pdf

Gravitational Waves: Matched-filter searches for compact binary coalescences

Matched-filter searches for gravitational-wave signals from binary neutron stars, binary black holes and neutron-star-black-hole systems have been successful, but many simplifications have been made. There are a number of avenues to explore, including expanding the parameter space to precessing binaries or intermediate-mass black hole binaries, implementing multivariate statistics with analytic and machine-learning techniques, and developing deeper searches by coordinating with gamma-ray triggers. These projects include development work (Python, C) and will be implemented in the upcoming Virgo/LIGO science runs, potentially leading to new discoveries and physics.
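
The matched filter itself is conceptually simple: correlate the data against a unit-norm template at every time shift and look for a peak. A white-noise toy version (the "chirp" template shape and injection strength below are invented, not a real waveform model):

```python
import numpy as np

rng = np.random.default_rng(5)

fs, T = 1024, 4.0                          # sample rate (Hz) and duration (s)
t = np.arange(0, T, 1 / fs)

# Toy "chirp" template: a sinusoid with increasing frequency in a Gaussian
# envelope, normalised to unit norm.
template = np.sin(2 * np.pi * (50 + 20 * t) * t) * np.exp(-((t - 2.0) ** 2))
template /= np.linalg.norm(template)

# Data: white Gaussian noise with the template injected at a known amplitude.
data = rng.normal(0, 1, t.size) + 8.0 * template

# Matched filter for white noise: correlate the data with the unit-norm
# template at every time shift; the peak is the detection statistic.
snr = np.correlate(data, template, mode="same")
peak = np.abs(snr).max()
print(f"peak |SNR|: {peak:.1f}")
```

Real searches work in the frequency domain with the detector's coloured noise spectrum and banks of thousands of templates, but the correlate-and-threshold idea is the same.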

Gravitational Waves: Numerical relativity simulations of binary neutron star mergers

With the detection of the binary neutron star merger in August 2017 (GW170817), a new era of multi-messenger astronomy started. GW170817 proved that neutron star mergers are ideal laboratories to constrain the equation of state of cold supranuclear matter, to study the central engines of short GRBs, and to understand the origin and production of heavy elements.
The fundamental tools for understanding the last stages of the binary dynamics are numerical relativity simulations. In this project the student will be introduced to the basics of numerical relativity simulations of binary neutron stars and will be able to perform simulations on their own. Based on these simulations and this first experience, it will be possible to focus on one of the following aspects:

- the estimation of the ejected material released from the merger and the development of models for the electromagnetic signals

Gravitational Waves: A standard-siren measurement of the Hubble constant

The gravitational wave observation of the binary neutron star merger GW170817, with its coincident optical counterpart, led to the first "standard siren" measurement of the Hubble parameter, independent of the cosmological distance ladder. While multiple similar observations are expected to improve the precision of the measurement, a statistical method that cross-correlates gravitational-wave distance estimates with galaxy catalogues is expected to work even without identified electromagnetic transients, in particular for binary black hole mergers. The project would primarily be a study of various systematic effects in this analysis and of how to correct for them. The work will involve computational techniques to analyze LIGO-Virgo data. Some prior experience of programming is expected.
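
The principle of the standard-siren measurement fits in a few lines: the gravitational-wave amplitude yields the luminosity distance, while the counterpart (or a galaxy catalogue) yields the redshift. With round numbers similar in magnitude to the GW170817 event, used here purely for illustration:

```python
# Toy standard-siren estimate: the GW amplitude gives the luminosity
# distance, the host galaxy (or catalogue) gives the recession velocity.
c = 299792.458          # speed of light, km/s
d_L = 40.0              # Mpc, illustrative GW distance estimate
z = 0.0098              # illustrative host-galaxy redshift

# At such low redshift, v ≈ c z and H0 ≈ v / d_L.
H0 = c * z / d_L
print(f"H0 ~ {H0:.0f} km/s/Mpc")
```

The hard part, and the subject of this project, is everything hidden in this one line: distance-inclination degeneracies, peculiar velocities, selection effects and catalogue incompleteness.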

Detector R&D: Spectral X-ray imaging

When a conventional X-ray image is made to analyse the composition of a sample, or to perform a medical examination on a patient, one acquires an image that only shows intensities: a 'black and white' image. Most of the information carried by the photon energy is lost. This lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a 'colour' X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.

X-ray detectors based on Medipix/Timepix pixel chips have spectral resolving capabilities and can be used to make polychromatic X-ray images. Medipix and Timepix chips have branched from pixel chips developed for detectors for high energy physics collider experiments.

Activities in the field of (spectral) CT scans are performed in a collaboration between two institutes (Nikhef and CWI) and two companies (ASI and XRE).

Detector R&D: Compton camera

In the Nikhef R&D group we develop instrumentation for particle physics, but we also investigate how particle physics detectors can be used for other purposes. A successful development is the Medipix chip, which can be used in X-ray imaging. For large-scale medical applications, however, Compton scattering limits the energy-resolving possibilities. You will investigate whether it is in principle possible to design an X-ray application that detects both the Compton-scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab, where an X-ray scan can be performed.

Detector R&D: Holographic projector

A difficulty in generating holograms (based on the interference of light) is the required dense pixel pitch: one would need a pixel pitch of less than 200 nanometers. With larger pixels, artefacts occur due to spatial undersampling. A pixel pitch of 200 nanometers is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces undersampling artefacts for projectors with a 'low' pixel density. It uses 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or strongly suppresses) spatial periodicity. As a result, a holographic projector can be built with a significantly lower pixel density and correspondingly less computing power. This could bring holography within reach for many applications such as displays, lithography, 3D printing and metrology.

Of course, nothing comes for free: with fewer pixels, holograms become noisier and the contrast is reduced. The big question is: how do we derive the requirements for the projector (in terms of pixel density, pixel positioning, etc.) from the requirements for the holograms?
Requirements for a hologram can be expressed in terms of noise, contrast, resolution, suppression of undersampling artefacts, and so on.
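
A natural starting point for such simulations is to compare the far-field intensity of a regular and a randomly positioned emitter array: the regular grid repeats its main lobe at sin(theta) = lambda/pitch (an undersampling, or grating, order), which random positions suppress at the price of a speckle-like noise floor. All dimensions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

lam = 0.5e-6                         # wavelength: 500 nm
n_px, pitch = 200, 2e-6              # a 'low' pixel density: 2 um pitch
x_reg = np.arange(n_px) * pitch                        # regular grid
x_rnd = np.sort(rng.uniform(0, n_px * pitch, n_px))    # random positions

theta = np.linspace(-0.3, 0.3, 2000)                   # observation angles (rad)

def farfield(x):
    # Coherent sum of in-phase point emitters, normalised to the peak.
    phase = 2j * np.pi * np.outer(np.sin(theta), x) / lam
    return np.abs(np.exp(phase).sum(axis=1)) ** 2 / n_px ** 2

i_reg, i_rnd = farfield(x_reg), farfield(x_rnd)

# The regular grid repeats its main lobe at sin(theta) = lam / pitch = 0.25;
# the random array suppresses it to a speckle floor of order 1/n_px.
k = np.argmin(np.abs(np.sin(theta) - lam / pitch))
print(f"grating order: regular {i_reg[k]:.2f}, random {i_rnd[k]:.4f}")
```

The trade-off named above is directly visible in this toy: the grating order disappears, while the background between lobes rises to the ~1/n_px speckle level.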

For this project we are building a proof-of-concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms, of course).

Students can do hands-on lab work (building and testing the prototype projector) and/or work on setting up simulation methods and models. Simulations in this field can be highly parallelized and are preferably written for parallel and/or GPU computing.

Detector R&D: Laser Interferometer Space Antenna (LISA)

The space-based gravitational wave antenna LISA is without doubt one of the most challenging space missions ever proposed. Around 2030, ESA plans to launch three spacecraft, separated by a few million kilometers, to measure tiny variations in the distances between test masses located in each spacecraft, in order to detect gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic, requiring constant fine-tuning of the pointing of the laser links between the spacecraft and simultaneous refocusing of the telescopes. The noise related to the laser links is expected to give a dominant contribution to the LISA performance.

An update and extension of the LISA science simulation software is needed to assess the hardware being developed for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations, based on hardware (noise) characterization measurements performed at TNO, will be carried out and compared to the signals expected from gravitational wave sources.

KM3NeT: Reconstruction of first neutrino interactions in KM3NeT

The neutrino telescope KM3NeT, which aims to detect cosmic neutrinos, is under construction in the Mediterranean Sea. Its first two strings with sensitive photodetectors were deployed in 2015 and 2016. Even these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. To identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector, and with this pave the path towards neutrino astronomy.

ANTARES: Analysis of IceCube neutrino sources

The only evidence so far for high-energy neutrinos from cosmic sources comes from detections with the IceCube detector. Most of the detected events were reconstructed with a large uncertainty on their direction, which has prevented an association with astrophysical sources. Only for the high-energy muon neutrino candidates has a good directional resolution been achieved, but also for those no significant correlation with astrophysical sources has been detected to date.
The ANTARES neutrino telescope has taken neutrino data continuously since 2007 with a high angular resolution, which can be exploited to further scrutinize the locations of these neutrino sources. In this project we will address the neutrino sources in a stacked analysis to further probe the origin of the neutrinos with enhanced sensitivity.

VU LaserLaB: Measuring the electric dipole moment of the electron

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe of physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so, we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations for the design of a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

VU LaserLab: Physics beyond the Standard model from molecules

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, in which all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Another target of study is the connection to the "proton size puzzle", which may be solved through studies of the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:

These five results mark the various directions we are pursuing, and in all of them we aim at further improvements. Specific projects for students can be defined; these are mostly experimental, although there might be some theoretical tasks, such as:

Performing calculations of hyperfine structures

On the theory side there might also be an international connection for especially bright theory students: we collaborate closely with Prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working partly in Warsaw and partly in Amsterdam. Prof. Frédéric Merkt from ETH Zurich, an expert in the field, will come to work with us on "hydrogen" from August to December 2018 while on sabbatical.