Projects with September 2019 start

Theory: The Effective Field Theory Pathway to New Physics at the LHC

The Standard Model Effective Field Theory (SMEFT) is a very promising framework for parametrising, in a robust and model-independent way, deviations from the Standard Model (SM) induced by new heavy particles. In this formalism, beyond-the-SM effects are encapsulated in higher-dimensional operators constructed from SM fields and respecting their symmetry properties. In this project we aim to carry out a global analysis of the SMEFT using high-precision LHC data, including Higgs boson production, flavour observables, and low-energy measurements. The analysis will be carried out in the context of the recently developed SMEFiT approach [1], which is based on Machine Learning techniques to efficiently explore the complex theory parameter space. The ultimate goal is either to uncover glimpses of new particles or interactions at the LHC, or to derive the most stringent model-independent bounds to date on general theories of New Physics.
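As a toy flavour of what such a global fit does (this is not the SMEFiT methodology itself), the sketch below bounds a single Wilson coefficient c from one hypothetical signal-strength measurement, where the linear and quadratic EFT sensitivities a and b, the measurement and its uncertainty are all invented numbers:

```python
import numpy as np

# Toy illustration: bound one Wilson coefficient c from a single hypothetical
# signal-strength measurement mu(c) = 1 + a*c + b*c^2.
a, b = 0.12, 0.02          # hypothetical dim-6 linear/quadratic sensitivities
mu_obs, sigma = 1.00, 0.10 # hypothetical measurement and uncertainty

c_grid = np.linspace(-10, 10, 2001)
mu_th = 1.0 + a * c_grid + b * c_grid**2
chi2 = ((mu_th - mu_obs) / sigma) ** 2

# 95% CL interval for one parameter: delta-chi2 < 3.84
allowed = c_grid[chi2 - chi2.min() < 3.84]
c_lo, c_hi = allowed.min(), allowed.max()
print(f"95% CL interval: [{c_lo:.2f}, {c_hi:.2f}]")
```

Note how the quadratic term makes the allowed interval asymmetric; a real global analysis does this simultaneously for dozens of operators and hundreds of measurements, which is where the Machine Learning sampling becomes essential.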

Theory: Pinning down the initial state of heavy-ion collisions with Machine Learning

It has been known for more than three decades that the parton distribution functions (PDFs) of nucleons bound within heavy nuclei are modified with respect to their free-nucleon counterparts. Despite active experimental and theoretical investigation, the underlying mechanisms that drive these in-medium modifications of nucleon substructure have yet to be fully understood. The determination of nuclear PDFs is highly relevant both for improving our fundamental understanding of the strong interaction in the nuclear environment and for the interpretation of heavy-ion collisions at RHIC and the LHC, in particular for the characterization of the Quark-Gluon Plasma. The goal of this project is to exploit Machine Learning and Artificial Intelligence tools [1,2] (neural networks trained by stochastic gradient descent) to pin down the initial state of heavy-ion collisions using recent measurements from proton-lead collisions at the LHC. Emphasis will be put on the nuclear modifications of the gluon PDFs, which are still mostly terra incognita and highly relevant for phenomenological applications. In addition to theory calculations, the project will also involve code development using modern AI/ML tools such as TensorFlow and Keras.
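A minimal sketch of the core ML ingredient, assuming nothing about the actual fitting code: a small neural network trained by stochastic gradient descent to fit a toy nuclear modification factor R(x) from noisy pseudo-data (plain NumPy here for self-containedness; the project itself would use TensorFlow/Keras):

```python
import numpy as np

# Toy: a one-hidden-layer network parametrises R(x) = f_nucleus(x)/f_proton(x)
# and is trained by plain SGD. The "data" are invented pseudo-data, not
# real measurements, and the shadowing-like shape below is assumed.
rng = np.random.default_rng(0)
x = rng.uniform(1e-3, 1.0, 200)
R_true = 1.0 - 0.3 * np.exp(-5 * x)        # assumed shadowing-like shape
y = R_true + rng.normal(0, 0.02, x.size)   # pseudo-data with noise

W1, b1 = rng.normal(0, 1, (8, 1)), np.zeros((8, 1))
W2, b2 = rng.normal(0, 1, (1, 8)), np.zeros((1, 1))

def forward(xb):
    h = np.tanh(W1 @ xb + b1)
    return W2 @ h + b2, h

def loss():
    pred, _ = forward(x[None, :])
    return np.mean((pred[0] - y) ** 2)

loss0 = loss()
eta = 0.05
for _ in range(2000):                       # SGD on minibatches of 32
    idx = rng.integers(0, x.size, 32)
    xb, yb = x[None, idx], y[idx]
    pred, h = forward(xb)
    g = 2 * (pred - yb) / yb.size           # dL/dpred
    gW2, gb2 = g @ h.T, g.sum(1, keepdims=True)
    gz = (W2.T @ g) * (1 - h**2)            # backprop through tanh
    gW1, gb1 = gz @ xb.T, gz.sum(1, keepdims=True)
    W2 -= eta * gW2; b2 -= eta * gb2
    W1 -= eta * gW1; b1 -= eta * gb1

print(f"MSE before: {loss0:.3f}, after: {loss():.4f}")
```

In the real analysis the network output feeds a full QCD convolution and the pseudo-data are replaced by LHC proton-lead measurements, but the training loop is conceptually this one.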

Theory: The High-Energy Muon Crisis and Perturbative QCD

The production of charmed mesons in collisions of high-energy cosmic rays with air nucleons in the upper atmosphere provides an important component of the flux of high-energy muons and neutrinos that can be detected at cosmic-ray experiments such as AUGER and neutrino telescopes such as KM3NeT and IceCube. The production of forward muons from charmed-meson decays is usually predicted from QCD models tuned to data, rather than from first-principles QCD calculations. Interestingly, the number of such high-energy muons observed by AUGER seems to differ markedly from current theory predictions. In this project we aim to exploit state-of-the-art perturbative and non-perturbative QCD techniques to compute the flux of high-energy muons from charm decays and to make predictions for a number of experiments sensitive to them.

Dark Matter: XENON1T Data Analysis

The XENON collaboration has used the XENON1T detector to achieve the world’s most sensitive direct-detection dark matter results and is currently building its successor, XENONnT. The detectors operate at the Gran Sasso underground laboratory and consist of so-called dual-phase xenon time-projection chambers filled with ultra-pure xenon. Our group has an opening for a motivated MSc student to do analysis with the data from the XENON1T detector. The work will consist of understanding the detector signals and applying machine learning tools, such as deep neural networks, to improve the reconstruction performance in our Python-based analysis tool, following the approach described in arXiv:1804.09641. The final goal is to improve the energy- and position-reconstruction uncertainties for the dark matter search. There will also be opportunities to do data-taking shifts at the Gran Sasso underground laboratory in Italy.
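To make the reconstruction task concrete, here is a toy baseline, not the XENON algorithm: the S2 light pattern on a ring of top PMTs is used to estimate the (x, y) of an interaction via a charge-weighted centroid. All geometry and light-response numbers are invented; the project would replace such an estimator with a neural network (cf. arXiv:1804.09641):

```python
import numpy as np

# Toy position reconstruction in a dual-phase TPC: photons from the S2 signal
# are distributed over a ring of top PMTs, and the interaction position is
# estimated from the hit pattern. Geometry and light model are invented.
rng = np.random.default_rng(1)
n_pmt = 16
angles = np.linspace(0, 2 * np.pi, n_pmt, endpoint=False)
pmt_xy = 40.0 * np.stack([np.cos(angles), np.sin(angles)], axis=1)  # cm

def simulate_hits(true_xy, n_photons=5000):
    """Toy light collection: 1/d^2 (solid-angle-like) weighting per PMT."""
    d2 = np.sum((pmt_xy - true_xy) ** 2, axis=1) + 25.0  # 5 cm standoff
    p = (1.0 / d2) / np.sum(1.0 / d2)
    return rng.multinomial(n_photons, p)

def centroid(hits):
    """Charge-weighted centroid estimator of (x, y)."""
    return (hits[:, None] * pmt_xy).sum(0) / hits.sum()

true_xy = np.array([10.0, -5.0])
rec = centroid(simulate_hits(true_xy))
print("reconstructed:", rec)
```

A trained network can beat the centroid because it learns the detector's actual, non-ideal light response rather than assuming a simple geometric weighting.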

Dark Matter: XAMS R&D Setup

The Amsterdam Dark Matter group operates an R&D xenon detector at Nikhef. The detector is a dual-phase xenon time-projection chamber and contains about 4 kg of ultra-pure liquid xenon. We plan to use this detector to develop new detection techniques (such as new photosensors) and to improve the understanding of the response of liquid xenon to various forms of radiation. The results could be used directly in the XENON experiment, the world’s most sensitive direct-detection dark matter experiment, at the Gran Sasso underground laboratory. We have several interesting projects for this facility. We are looking for someone who is interested in working in a laboratory on high-tech equipment, modifying the detector, taking data and analyzing the data themselves. You will "own" this experiment.

Dark Matter: DARWIN Sensitivity Studies

DARWIN is the "ultimate" direct-detection dark matter experiment, with the goal of reaching the so-called "neutrino floor", where neutrinos become a hard-to-reduce background. The large and exquisitely clean xenon mass will also make DARWIN sensitive to other physics signals, such as solar neutrinos, double-beta decay of Xe-136, and axions and axion-like particles. While the experiment will only start in 2025, we are in the midst of optimizing its design, which is driven by simulations. We have an opening for a student to work on the GEANT4 Monte Carlo simulations for DARWIN, as part of a simulation team together with the Universities of Freiburg and Zurich. We are also working on a "fast simulation" that could be included in this framework. This is your opportunity to steer the optimization of a large and unique experiment. The project requires good programming skills (Python and C++) and data-analysis/physics-interpretation skills.

The Modulation Experiment: Data Analysis

A few measurements exist that suggest an annual modulation in the activity of radioactive sources. Together with a few groups from the XENON collaboration we have developed four table-top experiments to investigate this effect with several well-known radioactive sources. The experiments are under construction at Purdue University (USA), on a mountain top in Switzerland, on a beach in Rio de Janeiro, and at Nikhef in Amsterdam. We urgently need a master student to (1) analyze the first big data set and (2) contribute to the first physics paper from the experiment. We are looking for all-round physicists with an interest in both lab work and data analysis. The student(s) will collaborate directly with the other groups in this small collaboration (around 10 people), and the goal is to have the first physics publication ready by the end of the project. During the 2018-2019 season there are positions for two MSc students.
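The heart of such an analysis is fitting a modulated rate to the count data. A minimal sketch on toy data (the rate, modulation amplitude and phase below are invented): with the period fixed to one year, the model R(t) = R0 (1 + m cos(2πt/T − φ)) is linear in (R0, R0·m·cosφ, R0·m·sinφ) and can be fitted by ordinary least squares:

```python
import numpy as np

# Toy annual-modulation fit: generate Poisson counts from a weakly modulated
# rate, then recover the modulation amplitude with a linear harmonic fit.
rng = np.random.default_rng(2)
T = 365.25
t = np.arange(0, 730, 1.0)                      # two years, daily bins
m_true, phi_true, R0 = 0.005, 1.0, 1e5          # assumed 0.5% modulation
rate = R0 * (1 + m_true * np.cos(2 * np.pi * t / T - phi_true))
counts = rng.poisson(rate)

# linear least-squares fit in the harmonic basis (period fixed to one year)
X = np.stack([np.ones_like(t),
              np.cos(2 * np.pi * t / T),
              np.sin(2 * np.pi * t / T)], axis=1)
coef, *_ = np.linalg.lstsq(X, counts.astype(float), rcond=None)
R0_fit = coef[0]
m_fit = np.hypot(coef[1], coef[2]) / R0_fit
print(f"fitted modulation amplitude: {m_fit:.4f}")
```

The real analysis additionally has to handle detector effects such as gain drifts, temperature dependence and dead time, which can mimic exactly this kind of slow modulation.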

ATLAS: The lifetime of the Higgs boson

While the Higgs boson was discovered in 2012, many of its properties remain unconstrained. This master project revolves around one such property: the lifetime of the Higgs boson. The lifetime can be obtained by measuring the width of the boson, but because the width is a few hundred times smaller than the detector resolution, a direct measurement is currently impossible. There is, however, an idea to overcome that limitation: by exploiting the interference between Higgs boson decays and background processes, we can perform an indirect measurement. This measurement potentially has the sensitivity to probe the width (or lifetime) predicted by the Standard Model. Specifically, the master project will be about predicting the sensitivity of this measurement for different assumed values of the Higgs width. The project is at the interface of theory and experiment, making use of Monte Carlo generators and the standard HEP analysis tools (ROOT, C++, Python).

ATLAS: The Next Generation

After the observation of the coupling of the Higgs boson to fermions of the third generation, the search for its coupling to fermions of the second generation is one of the next priorities for research at CERN's Large Hadron Collider. The search for the decay of the Higgs boson to two charm quarks is very new [1] and we see various opportunities for interesting developments. For this project we propose improvements in reconstruction (using exclusive decays), advanced analysis techniques (using deep learning methods) and expanding the new-physics models (e.g. including a search for off-diagonal H->uc couplings). Another opportunity would be the development of the first statistical combination of results between the ATLAS and CMS experiments, which could significantly improve the discovery potential.

ATLAS: The Most Energetic Higgs Boson

The production of Higgs bosons at the highest energies could give the first indications of deviations from the Standard Model of particle physics, but production energies above 500 GeV have not been observed yet [1]. The LHC Run-2 dataset, collected during the last 4 years, might offer the first opportunity to observe such processes, and we have various ideas for new studies. Possible developments include the improvement of boosted-reconstruction techniques, for example using multivariate deep learning methods. There are also various opportunities for unexplored theory interpretations (using the MadGraph event generator), including effective field theory models (with novel ‘morphing’ techniques) and the study of the Higgs boson’s self-coupling.

LHCb: Measurement of Central Exclusive Production Rates of Chi_c using converted photons in LHCb

Central exclusive production (CEP) of particles at the LHC is characterised by an extremely clean signature. In contrast to typical inelastic collisions, where many particles are created and result in a so-called primary vertex, CEP events contain only the final-state particles of interest. In this project the particle of interest is a chi_c, a bound state of a charm quark and antiquark. In theory this process is generated by a long-range gluon exchange and can elucidate the nature of the strong force, described by quantum chromodynamics in the Standard Model. The proposed work involves analysing a pre-existing dataset with reconstructed chi_c candidates and simulating events in LHCb, in order to obtain the relative production rate of each chi_c species (spins 0, 1, 2), a quantity that can easily be compared to theoretical predictions.

LHCb: Optimization studies for Vertex detector at the High Lumi LHCb

The LHCb experiment is dedicated to measuring tiny differences between matter and antimatter through the precise study of rare processes involving b or c quarks. The LHCb detector will undergo a major modification in order to dramatically increase the luminosity and be able to measure indirect effects of physics beyond the Standard Model. In this environment, over 42 simultaneous collisions are expected within the roughly 200 ps interval in which the two proton bunches overlap. The particles of interest have a relatively long lifetime, and therefore the best way to distinguish them from the background collisions is through the precise reconstruction of displaced vertices and pointing directions. The new detector is foreseen to use very recent or even future technologies to measure space (with resolutions below 10 um) and time (100 ps or better) in order to efficiently reconstruct the events of interest for physics. The project involves completely changing the LHCb Vertex Locator (VELO) design in simulation and determining the best achievable performance for the upgraded detector, considering different spatial and temporal resolutions.

During the R&D phase for the LHCb VELO Upgrade detector a few sensor prototypes were irradiated to the extreme fluence expected to be achieved during the detector lifetime. These samples were tested using high energy particles at the SPS facility at CERN with their trajectories reconstructed by the Timepix3 telescope. A preliminary analysis revealed that at the highest irradiation levels the amount of signal observed is higher than expected, and even larger than the signal obtained at lower doses. At the Device Under Test (DUT) position inside the telescope, the spatial resolution attained by this system is below 2 um. This means that a detailed analysis can be performed in order to study where and how this signal amplification happens within the 55x55 um^2 pixel cell. This project involves analysing the telescope and DUT data to investigate the charge multiplication mechanism at the microscopic level.

Detector R&D: Studying fast timing detectors

Fast timing detectors are a promising solution for future tracking detectors. Under future LHC operating conditions and at future colliders, more and more particles are produced per collision. The high particle densities make it increasingly difficult to separate particle trajectories using only the spatial information that current silicon tracking detectors provide. A solution is to add very precise (on the order of 10 ps) timestamps to the spatial measurements of the particle trackers. A good understanding of the performance of fast timing detectors is necessary. With the use of a pulsed laser in the lab we study the characteristics of several prototype detectors.

KM3NeT: Reconstruction of first neutrino interactions in KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea, aiming to detect cosmic neutrinos. Its first few strings with sensitive photodetectors have been deployed at both the Italian and the French detector sites. Already these few strings make it possible to reconstruct the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos, an accurate reconstruction and an optimal understanding of the backgrounds are crucial. In this project we will use the available data together with simulations to optimally identify and reconstruct the first neutrino interactions in the KM3NeT detector (also applying machine learning for background suppression), and with this pave the path towards accurate neutrino-oscillation measurements and neutrino astronomy.

The study of cosmic neutrinos with energies above 10^17 eV, the so-called ultra-high-energy neutrinos, provides a unique view on the universe and may provide insight into the origin of the most violent astrophysical sources, such as gamma-ray bursts, supernovae or even dark matter. In addition, the observation of high-energy neutrinos may provide a unique tool to study interactions at high energies.

The energy deposition of these extreme neutrinos in water induces a thermo-acoustic signal, which can be detected using sensitive hydrophones. The expected neutrino flux is, however, extremely low, and the signal that neutrinos induce is small. TNO is presently developing sensitive hydrophone technology based on fiber optics. Optical fibers form a natural way to create a distributed sensing system. Using this technology a large-scale neutrino telescope can be built in the deep sea. TNO is aiming for a prototype hydrophone which will form the building block of a future telescope.

While the KM3NeT neutrino telescope is being constructed in the deep waters of the Mediterranean Sea, data from its precursor, Antares, have been accumulated for more than 10 years. The main objective of these neutrino telescopes is to determine the origin of cosmic neutrinos. The accuracy with which this origin can be determined critically depends on the probability density function (PDF) of the arrival time of the Cherenkov light produced by relativistic charged particles emerging from a neutrino interaction in the sea. It has been shown that these PDFs can be calculated from first principles, and that the obtained values can be efficiently interpolated in 4 and 5 dimensions without compromising the functional dependencies. The reconstruction software based on this input indeed yields the best resolution for KM3NeT. This project is aimed at applying the KM3NeT software to the available Antares data.

HiSPARC: Extensive Air Shower Reconstruction using Machine Learning

An important aspect of high energy cosmic ray research is the reconstruction of the direction and energy of the primary cosmic ray. This is done by measuring the footprint of the extensive air shower initiated by the cosmic ray. The goal of this project is to advance the creation of a reconstruction algorithm based on machine learning (ML) techniques.

A previous master student has made great progress in the creation of an ML algorithm for the direction reconstruction. The algorithm was trained on simulations and applied to real data. The method works quite well, but we expect that better results can be achieved by improving the simulated data set. In this project you will implement a more accurate description of the photomultiplier tube in the simulation pipeline and check whether the reconstruction improves. The next step would be to extend the algorithm towards energy reconstruction. This means scaling up the current method and will involve the creation and manipulation of large simulated data sets.
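For context, the classical (non-ML) baseline that such an algorithm is compared against reconstructs the shower direction from the arrival times of the shower front at the detector stations, assuming a plane front moving at the speed of light. A minimal sketch with an invented station layout:

```python
import numpy as np

# Plane-shower-front direction reconstruction from station arrival times.
# Station positions and the injected direction are invented for illustration.
c = 0.2998  # speed of light in m/ns
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

def plane_times(theta, phi):
    """Arrival times (ns) for a plane front with zenith theta, azimuth phi."""
    n = np.array([np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi)])
    return stations @ n / c

def reconstruct(times):
    # t_i = (x_i * nx + y_i * ny) / c + t0  ->  linear least squares
    A = np.column_stack([stations / c, np.ones(len(stations))])
    (nx, ny, _), *_ = np.linalg.lstsq(A, times, rcond=None)
    theta = np.arcsin(np.hypot(nx, ny))
    phi = np.arctan2(ny, nx)
    return theta, phi

theta_rec, phi_rec = reconstruct(plane_times(np.radians(30), np.radians(45)))
print(np.degrees(theta_rec), np.degrees(phi_rec))
```

With noiseless times this inverts the model exactly; the ML approach becomes interesting precisely when timing jitter, shower-front curvature and sampling fluctuations break the plane-front assumption.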

The HiSPARC group is small. As a student you can have a big impact and there is freedom to tailor your own project. The proposed project is for students with a particular interest in computational (astro)physics. Advanced programming skills (mainly Python) and Linux knowledge are desirable.

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe of physics beyond this Standard Model. Many extensions of the Standard Model, most prominently supersymmetry, naturally predict an electron EDM just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so, we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those probed at the LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help set up the laser system that will be used to detect BaF. Furthermore, projects are available to perform trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

VU LaserLab: Physics beyond the Standard model from molecules

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij), focuses on precision measurements in the hydrogen molecule and its isotopologues. The work aims at testing QED calculations of energy levels in H2, D2, T2, HD, etc. against the most precise measurements, in which all kinds of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). A further target of study is the connection to the "proton size puzzle", which may be solved through studies of the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in the following papers:

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like:

Performing calculations of hyperfine structures

As for the theory, there might also be an international connection for especially bright theory students: we collaborate closely with Prof. Krzysztof Pachucki, and we might find an opportunity for a student to perform (the best!) QED calculations in molecules, working in Warsaw and partly in Amsterdam. Prof. Frédéric Merkt of ETH Zurich, an expert in the field, will come to work with us on "hydrogen" during August-December 2018 while on sabbatical.

Projects with September 2018 start

Theory: Stress-testing the Standard Model at the high-energy frontier

A suitable framework for parametrising, in a model-independent way, deviations from the SM induced by new heavy particles is the Standard Model Effective Field Theory (SMEFT). In this formalism, beyond-the-SM (bSM) effects are encapsulated in higher-dimensional operators constructed from SM fields respecting their symmetry properties. Here we aim to perform a global analysis of the SMEFT from high-precision LHC data. This will be achieved by extending the NNPDF fitting framework to constrain the SMEFT coefficients, with the ultimate aim of identifying possible bSM signals.

Theory: The quark and gluon internal structure of heavy nuclei in the LHC era

A precise knowledge of the parton distribution functions (PDFs) of the proton is essential for making predictions for the Standard Model and beyond at hadron colliders. The presence of the nuclear medium, and of collective phenomena involving several nucleons, modifies the parton distribution functions of nuclei (nPDFs) compared to those of a free nucleon. These modifications have been investigated by different groups using global analyses of world data on high-energy nuclear reactions. It is important to determine the nPDFs not only for establishing perturbative QCD factorisation in nuclei, but also for applications to heavy-ion and neutrino physics. In this project the student will join an ongoing effort towards a data-driven determination of nPDFs, and will learn how to construct tailored Artificial Neural Networks (ANNs).

The formation of hadrons from quarks and gluons (collectively, partons) is a fundamental QCD process that has yet to be fully understood. Since parton-to-hadron fragmentation occurs over long distance scales, such information can only be extracted from experimental observables that identify mesons and baryons in the final state. Recent progress has been made in determining these fragmentation functions (FFs) from charged pion and kaon production in single-inclusive e+e- annihilation (SIA), and additionally in pp collisions and semi-inclusive deep-inelastic scattering (SIDIS). However, charged-hadron production in unpolarized pp collisions and inelastic lepton-proton scattering also requires information about the momentum distributions of the quarks and gluons in the proton, which is encoded in non-perturbative parton distribution functions (PDFs). In this project, a simultaneous treatment of both PDFs and FFs in a global QCD analysis of single-inclusive hadron production processes will be developed to determine the individual parton-to-hadron FFs. Furthermore, a robust statistical methodology with an artificial-neural-network learning algorithm will be used to obtain a precise estimation of the FF uncertainties. This work will in particular emphasise the impact of pp-collision and SIDIS data on the gluon and the separated quark/anti-quark FFs, respectively.

ALICE: Charm is in the Quark Gluon Plasma

The goal of heavy-ion physics is to study the Quark-Gluon Plasma (QGP), a hot and dense medium in which quarks and gluons move freely over large distances, larger than the typical size of a hadron. Hydrodynamic simulations predict that the QGP expands under its own pressure and cools while expanding. These simulations are particularly successful in describing key observables measured experimentally, such as particle spectra and various orders of flow harmonics. Charm quarks are produced very early in the evolution of a heavy-ion collision and can thus serve as an ideal probe of the properties of the QGP. The goal of the project is to study, for charm mesons such as the D0, D* and Ds, the higher-order flow harmonics (e.g. triangular flow, v3) that are most sensitive to the transport properties of the QGP. This would be the first measurement of its kind.
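A minimal sketch of how flow harmonics can be estimated with two-particle Q-cumulants, on toy events rather than real D-meson candidates (the multiplicities, event count and true v2/v3 values below are invented, and real analyses must additionally correct for non-flow correlations):

```python
import numpy as np

# Two-particle Q-cumulant estimate of v2 (elliptic) and v3 (triangular) flow.
# Azimuthal angles are sampled from a toy distribution
# dN/dphi ~ 1 + 2*v2*cos(2*phi) + 2*v3*cos(3*phi).
rng = np.random.default_rng(3)
v2_true, v3_true = 0.10, 0.05

def sample_event(mult=500):
    """Accept-reject sampling of azimuthal angles for one toy event."""
    phi = rng.uniform(0, 2 * np.pi, 4 * mult)
    w = 1 + 2 * v2_true * np.cos(2 * phi) + 2 * v3_true * np.cos(3 * phi)
    keep = rng.uniform(0, w.max(), phi.size) < w
    return phi[keep][:mult]

def vn_two(n, events):
    """v_n{2} = sqrt(<2>_n), with <2>_n = (|Q_n|^2 - M)/(M*(M-1)) per event."""
    vals = []
    for phi in events:
        M = phi.size
        Q = np.exp(1j * n * phi).sum()
        vals.append((abs(Q) ** 2 - M) / (M * (M - 1)))
    return np.sqrt(np.mean(vals))

events = [sample_event() for _ in range(200)]
v2_est, v3_est = vn_two(2, events), vn_two(3, events)
print("v2 ~", v2_est, " v3 ~", v3_est)
```

The Q-vector trick avoids the explicit double loop over particle pairs, which is what makes cumulant methods practical at LHC multiplicities.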

ALICE: Probing the time evolution of particle production in the Quark-Gluon Plasma

Particle production is governed by conservation laws, such as local charge conservation. The latter ensures that each charged particle is balanced by an oppositely charged partner, created at the same location in space and time. Charge-dependent angular correlations, traditionally studied with the balance function, have emerged as a powerful tool to probe the properties of the Quark-Gluon Plasma (QGP) created in high-energy collisions. The goal of this project is to take full advantage of the particle-identification capabilities of the ALICE detector, unique among the LHC experiments, to extend these studies to different particle species (e.g. pions, kaons, protons). These studies are highly anticipated by both the experimental and theoretical communities.

ALICE: CP violating effects in QCD: looking for the chiral magnetic effect with ALICE at the LHC

Within the Standard Model, symmetries such as the combination of charge conjugation (C) and parity (P), known as CP symmetry, are considered key principles of particle physics. CP violation can be accommodated within the Standard Model in both the weak and the strong interactions, but it has only been confirmed experimentally in the former. Theory predicts that in heavy-ion collisions gluonic fields create domains where parity symmetry is locally violated. This manifests itself as a charge-dependent asymmetry in the production of particles relative to the reaction plane, called the Chiral Magnetic Effect (CME). The first experimental results from STAR (RHIC) and ALICE (LHC) are consistent with the expectations from the CME, but background effects have not yet been properly disentangled. In this project you will develop and test new observables for the CME, trying to understand and discriminate the background sources that affect such a measurement.

LHCb: Searching for dark matter in exotic six-quark particles

Most of the matter in the Universe is of an unknown type. Many hypotheses about this dark matter have been proposed, but none has been confirmed. Recently it has been proposed that it could consist of particles made of the six quarks uuddss. Such a particle could be produced in decays of heavy baryons. It is proposed to use Xi_b baryons produced at LHCb to search for such a state, which would appear as missing 4-momentum in a kinematically constrained decay. The project consists of optimising a selection and applying it to LHCb data. See arXiv:1708.08951.

LHCb: Measurement of BR(B0 → Ds+ Ds-)

This project aims to measure the branching fraction of the decay B0 -> Ds+ Ds-. This decay is rare because it proceeds via Cabibbo-suppressed W-boson exchange between the b and the d quark of the B0 meson, and it has not yet been observed. Theoretical calculations predict a branching fraction of order 10^-5, while the best experimental upper limit is 3.6x10^-5.
A measurement of the decay rate of B0 -> Ds+Ds- relative to that of B0 -> D+D- can provide an estimate of the W-exchange contribution to the latter decay, a crucial piece of information for extracting the CKM angle gamma from B0 -> D(*)D(*).
The aim is to determine the relative branching fraction of B0 -> Ds+Ds- with respect to B0 -> Ds+D- decays (currently the best-known branching ratio, (7.2 +- 0.8)x10^-3), in close collaboration with a PhD student. The aim is that this project results in a journal publication on behalf of the LHCb collaboration. Computer skills are needed for this project: the ROOT framework and C++ and/or Python macros are used. This project is closely related to previous analyses in the group. Weekly video meetings with CERN coordinate the efforts within the LHCb collaboration.
Relevant information:
[1] M. Jung and S. Schacht, "Standard Model Predictions and New Physics Sensitivity in B -> DD Decays", https://arxiv.org/pdf/1410.8396.pdf
[2] L. Bel, K. de Bruyn, R. Fleischer, M. Mulder, N. Tuning, "Anatomy of B -> DD Decays", https://arxiv.org/pdf/1505.01361.pdf
[3] A. Zupanc et al. [Belle Collaboration], "Improved measurement of B0 -> Ds-D+ and search for B0 -> Ds+Ds- at Belle", https://arxiv.org/pdf/hep-ex/0703040.pdf
[4] B. Aubert et al. [BaBar Collaboration], "Search for the W-exchange decays B0 -> Ds(*)-Ds(*)+", https://arxiv.org/pdf/hep-ex/0510051.pdf
[5] R. Aaij et al. [LHCb Collaboration], "First observations of B0s -> D+D-, Ds+D- and D0D0bar decays", https://arxiv.org/pdf/1302.5854.pdf
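The arithmetic of the relative measurement described above can be sketched in a few lines. The yields and efficiencies below are invented placeholders; only the normalisation branching fraction (7.2 +- 0.8)x10^-3 comes from the text:

```python
import numpy as np

# Relative branching-fraction measurement:
# BR(B0 -> Ds+Ds-) = (N_sig/N_norm) * (eff_norm/eff_sig) * BR(B0 -> Ds+D-)
N_sig, N_norm = 40.0, 12000.0          # hypothetical fitted signal yields
eff_sig, eff_norm = 0.012, 0.015       # hypothetical selection efficiencies
BR_norm, dBR_norm = 7.2e-3, 0.8e-3     # BR(B0 -> Ds+D-) and its uncertainty

BR_sig = (N_sig / N_norm) * (eff_norm / eff_sig) * BR_norm

# leading relative uncertainties in quadrature: Poisson yields + external BR
rel = np.sqrt(1 / N_sig + 1 / N_norm + (dBR_norm / BR_norm) ** 2)
print(f"BR = ({BR_sig * 1e5:.2f} +- {BR_sig * rel * 1e5:.2f}) x 10^-5")
```

The point of measuring a ratio is that most detector and luminosity systematics cancel between the (topologically very similar) signal and normalisation channels, leaving the signal yield and the external branching fraction as the dominant uncertainties.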

Virgo: Fast determination of gravitational wave properties

In the era of multi-messenger astronomy, the development of fast, accurate and computationally cheap methods for inferring the properties of gravitational-wave signals is of paramount importance. In this project we will work on the development of rapid Bayesian parameter-estimation methods for binary neutron stars as well as precessing black-hole binaries. Bayesian parameter estimation requires the evaluation of a likelihood that describes the probability of obtaining the data for a given set of model parameters, in this case the parameters of the gravitational-wave signal. Bayesian inference for gravitational-wave parameter estimation may require millions of such evaluations, making it computationally costly. This work will combine machine learning/deep learning methods with reduced-order modelling of gravitational-wave sources to speed up Bayesian inference of gravitational waves.
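The likelihood being evaluated millions of times has, in its simplest form (white Gaussian noise in the time domain), the shape sketched below. The "waveform" here is a toy chirp, not a real waveform model, and the injected parameters are invented:

```python
import numpy as np

# Toy Gaussian likelihood for gravitational-wave parameter estimation:
# ln L(theta) = -1/2 * sum((d - h(theta))^2 / sigma^2)
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 4096)
sigma = 1.0

def template(f0, fdot):
    """Toy 'chirp': a sinusoid whose frequency increases linearly in time."""
    return 0.5 * np.sin(2 * np.pi * (f0 + fdot * t) * t)

d = template(50.0, 20.0) + rng.normal(0, sigma, t.size)  # injected signal

def log_like(f0, fdot):
    r = d - template(f0, fdot)
    return -0.5 * np.sum(r**2) / sigma**2

# the likelihood peaks near the injected parameters
print(log_like(50.0, 20.0), log_like(60.0, 20.0))
```

Each evaluation requires generating a waveform over the full data span, which is exactly the cost that reduced-order models and ML surrogates are designed to cut; real analyses also work in the frequency domain with a coloured noise power spectral density.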

With the detection of the binary neutron star merger in August 2017 (GW170817) a new era of multi-messenger astronomy started. GW170817 proved that neutron star mergers are ideal laboratories to constrain the equation of state of cold supranuclear matter, to study the central engines of short GRBs, and to understand the origin and production of heavy elements.
The fundamental tool for understanding the last stages of the binary dynamics is numerical relativity simulations. In this project the student will be introduced to the basics of numerical relativity simulations of binary neutron star mergers and will be able to perform simulations on their own. Based on these simulations and this first experience, it will be possible to focus on one of the following aspects:

- the estimation of the ejected material released from the merger and the development of models for the electromagnetic signals

Gravitational-wave observation of the binary neutron star merger GW170817, with its coincident optical counterpart, led to a first "standard siren" measurement of the Hubble parameter independent of the cosmological distance ladder. While multiple similar observations are expected to improve the precision of the measurement, a statistical method that cross-correlates gravitational-wave distance estimates with galaxy catalogues is expected to work even without identified electromagnetic transients, and for binary black hole mergers in particular. The project would primarily be a study of various systematic effects in this analysis and of correcting for them. The work will involve the use of computational techniques to analyze LIGO-Virgo data. Some prior experience of programming is expected.

When a conventional X-ray image is made to analyse the composition of a sample, or to perform a medical examination on a patient, one acquires an image that only shows intensities: a ‘black and white’ image. Most of the information carried by the photon energy is lost. The lack of spectral information can result in an ambiguity between the material composition and the amount of material in the sample. If the X-ray intensity can be measured as a function of energy (i.e. a ‘colour’ X-ray image), more information can be obtained from a sample. This translates into a lower required dose and/or a better understanding of the sample under investigation. Two fields that can benefit from spectral X-ray imaging are, for example, mammography and real-time CT.

X-ray detectors based on Medipix/Timepix pixel chips have spectral resolving capabilities and can be used to make polychromatic X-ray images. Medipix and Timepix chips have branched from pixel chips developed for detectors for high energy physics collider experiments.

Activities in the field of (spectral) CT scans are performed in a collaboration between two institutes (Nikhef and CWI) and two companies (ASI and XRE).

Detector R&D: Compton camera

In the Nikhef R&D group we develop instrumentation for particle physics but we also investigate how particle physics detectors can be used for different purposes. A successful development is the Medipix chip that can be used in X-ray imaging. For use in large scale medical applications compton scattering limits however the energy resolving possibilities. You will investigate whether it is in principle possible to design a X-ray application that detects the compton scattered electron and the absorbed photon. Your ideas can be tested in practice in the lab where a X-ray scan can be performed.

Detector R&D: Holographic projector

A difficulty in generating holograms (based on the interference of light) is the required dense pixel pitch. One would need a pixel pitch of less than 200 nanometer. With larger pixels artefacts occur due to spatial under sampling. A pixel pitch of 200 nanometer is difficult, if not, impossible, to achieve, especially for larger areas. Another challenge is the massive amount of computing power that would be required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces under sampling artefacts for projectors with a ‘low’ pixel density. It is using 'pixels' at random but known positions, resulting in an array of (coherent) light points that lacks (or has strongly surpressed) spatial periodicity. As a result a holographic projector can be built with a significantly lower pixel density and correspondingly less required computing power. This could bring holography in reach for many applications like display, lithography, 3D printing, metrology, etc..

Of course, nothing comes for free: With less pixels, holograms become noisier and the contrast will be reduced. The big question: How do we determine the requirements (in terms of pixel density, pixel positioning, etc..) for the holographic projector based on requirements for the holograms?
Requirements for a hologram can be expressed in terms of: Noise, contrast, resolution, suppression of under sampling artefacts, etc..

For this project we are building a proof of concept holographic emitter. This set-up will be used to verify simulation results (and also to project some cool holograms of course).

Students can do hands on lab-work (building and testing the proto type projector) and/or work on setting up simulation methods and models. Simulations in this field can be highly parallelized and are preferably written for parallel computing and/or GPU computing.

Detector R&D: Laser Interferometer Space Antenna (LISA)

The space-based gravitational wave antenna LISA is without doubt one of the most challenging space missions ever proposed. ESA plans to launch around 2030 three spacecrafts that are separated by a few million kilometers to measure tiny variations in the distances between test-masses located in each spacecraft to detect the gravitational waves from sources such as supermassive black holes. The triangular constellation of the LISA mission is dynamic requiring a constant fine tuning related to the pointing of the laser links between the spacecrafts and a simultaneous refocusing of the telescope. The noise sources related to the laser links are expected to provide a dominant contribution to the LISA performance.

An update and extension of the LISA science simulation software is needed to assess the hardware development for LISA at Nikhef, TNO and SRON. A position is therefore available for a master student to study the impact of instrumental noise on the performance of LISA. Realistic simulations based on hardware (noise) characterization measurements that were done at TNO will be carried out and compared to the expected tantalizing gravitational wave sources.

KM3NeT : Reconstruction of first neutrino interactions in KM3NeT

The neutrino telescope KM3NeT is under construction in the Mediterranean Sea aiming to detect cosmic neutrinos. Its first two strings with sensitive photodetectors have been deployed 2015&2016. Already these few strings provide for the option to reconstruct in the detector the abundant muons stemming from interactions of cosmic rays with the atmosphere and to identify neutrino interactions. In order to identify neutrinos an accurate reconstruction and optimal understanding of the backgrounds are crucial. In this project we will use the available data to identify and reconstruct the first neutrino interactions in the KM3NeT detector and with this pave the path towards neutrino astronomy.

ANTARES: Analysis of IceCube neutrino sources.

The only evidence for high energetic neutrinos from cosmic sources so far comes from detections with the IceCube detector. Most of the detected events were reconstructed with a large uncertainty on their direction, which has prevented an association to astrophysical sources. Only for the high energetic muon neutrino candidates a high resolution in the direction has been achieved, but also for those no significant correlation to astrophysical sources has to date been detected.
The ANTARES neutrino telescope has since 2007 continuously taken neutrino data with high angular resolution, which can be exploited to further scrutinize the locations of these neutrino sources. In this project we will address the neutrino sources in a stacked analysis to further probe the origin of the neutrinos with enhanced sensitivity.

In collaboration with Nikhef and the Van Swinderen Institute for Particle Physics and Gravity at the University of Groningen, we have recently started an exciting project to measure the electric dipole moment (EDM) of the electron in cold beams of barium-fluoride molecules. The eEDM, which is predicted by the Standard Model of particle physics to be extremely small, is a powerful probe to explore physics beyond this Standard Model. All extensions to the Standard Model, most prominently supersymmetry, naturally predict an electron EDM that is just below the current experimental limits. We aim to improve on the best current measurement by at least an order of magnitude. To do so we will perform a precision measurement on a slow beam of laser-cooled BaF molecules. With this low-energy precision experiment, we test physics at energies comparable to those of LHC!

At LaserLaB VU, we are responsible for building and testing a cryogenic source of BaF molecules. The main parts of this source are currently being constructed in the workshop. We are looking for enthusiastic master students to help setup the laser system that will be used to detect BaF. Furthermore, projects are available to perform simulations of trajectory simulations to design a lens system that guides the BaF molecules from the exit of the cryogenic source to the experiment.

VU LaserLab: Physics beyond the Standard model from molecules

Our team, with a number of staff members (Ubachs, Eikema, Salumbides, Bethlem, Koelemeij) focuses on precision measurements in the hydrogen molecule, and its isotopomers. The work aims at testing the QED calculations of energy levels in H2, D2, T2, HD, etc. with the most precise measurements, where all kind of experimental laser techniques play a role (cw and pulsed lasers, atomic clocks, frequency combs, molecular beams). Also a target of studies is the connection to the "Proton size puzzle", which may be solved through studies in the hydrogen molecular isotopes.

In the past half year we have produced a number of important results that are described in
the following papers:

These five results mark the various directions we are pursuing, and in all directions we aim at obtaining improvements. Specific projects with students can be defined; those are mostly experimental, although there might be some theoretical tasks, like:

Performing calculations of hyperfine structures

As for the theory there might also be an international connection for specifically bright theory students: we collaborate closely with prof. Krzystof Pachucki; we might find an opportunity
for a student to perform (the best !) QED calculations in molecules, when working in Warsaw and partly in Amsterdam. Prof Frederic Merkt from the ETH Zurich, an expert in the field, will come to work with us on "hydrogen"
during August - Dec 2018 while on sabbatical.