Summary

Making accurate predictions is a crucial factor in many systems (such as modelling energy consumption, power load forecasting, traffic networks, the process industry, environmental modelling, biomedicine and brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim at realizing a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models through constrained optimization problems, studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered, with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects relevant to black-box models will be studied, including the incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general-purpose optimization algorithms. The methods should be suitable for handling larger data sets and high-dimensional input spaces. The final goal is also to realize a next-generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, and static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims at getting end-users connected to these more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection with several real-life applications.
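The primal/dual kernel models mentioned above can be illustrated with a minimal kernel ridge regression, closely related to least-squares SVMs. The data, kernel bandwidth and regularization value below are invented for the sketch; this shows the general technique, not the project's actual formulation:

```python
import numpy as np

# Toy sketch (invented data; illustrates the general technique, not the
# project's formulation): kernel ridge regression, whose estimator has a
# Lagrange-dual representation f(x) = sum_i alpha_i * k(x_i, x), as in LS-SVMs.

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_dual(X, y, lam=1e-3, gamma=1.0):
    """Dual variables alpha from the regularized system (K + lam*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    """Evaluate the dual model representation at new inputs."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

X = np.array([[0.0], [1.0], [2.0]])   # toy inputs
y = np.array([0.0, 1.0, 4.0])         # toy targets
alpha = fit_dual(X, y)
print(predict(X, alpha, X))           # close to y for small lam
```

The regularization weight `lam` and the kernel bandwidth `gamma` play the role of the "regularization mechanisms" above; additional constraints on the optimization problem would enter as extra terms in the linear system.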

Max ERC Funding

2 485 800 €

Duration

Start date: 2012-04-01, End date: 2017-03-31

Project acronym

ALUFIX

Project

Friction stir processing based local damage mitigation and healing in aluminium alloys

Researcher (PI)

Aude SIMAR

Host Institution (HI)

UNIVERSITE CATHOLIQUE DE LOUVAIN

Call Details

Starting Grant (StG), PE8, ERC-2016-STG

Summary

ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short durations at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations, leading to far-out-of-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases in an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated in aluminium in addition to refining and homogenising the microstructure. The “local stress management strategy” favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium compositionally graded microstructure at the aluminium/healing agent interface capable of liquid-phase healing after a thermal treatment. Along the way, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.

Max ERC Funding

1 497 447 €

Duration

Start date: 2017-01-01, End date: 2021-12-31

Project acronym

BOSS-WAVES

Project

Back-reaction Of Solar plaSma to WAVES

Researcher (PI)

Tom VAN DOORSSELAERE

Host Institution (HI)

KATHOLIEKE UNIVERSITEIT LEUVEN

Call Details

Consolidator Grant (CoG), PE9, ERC-2016-COG

Summary

"The solar coronal heating problem is a long-standing astrophysical problem. The slow DC (reconnection) heating models are well developed in detailed 3D numerical simulations. The fast AC (wave) heating mechanisms have traditionally been neglected since there were no wave observations.
Since 2007, we know that the solar atmosphere is filled with transverse waves, but still we have no adequate models (except for my own 1D analytical models) for their dissipation and plasma heating by these waves. We urgently need to know the contribution of these waves to the coronal heating problem.
In BOSS-WAVES, I will innovate the AC wave heating models by utilising novel 3D numerical simulations of propagating transverse waves. From previous results in my team, I know that the inclusion of the back-reaction of the solar plasma is crucial in understanding the energy dissipation: the wave heating leads to chromospheric evaporation and plasma mixing (by the Kelvin-Helmholtz instability).
BOSS-WAVES will bring the AC heating models to the same level of state-of-the-art DC heating models.
The high-risk, high-gain goals are (1) to create a coronal loop heated by waves, starting from an "empty" corona, by evaporating chromospheric material, and (2) to pioneer models for whole active regions heated by transverse waves."
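The energy transfer at the heart of wave heating can be caricatured as simple bookkeeping: wave energy dissipated per unit time ends up as plasma heat. This is a toy zero-dimensional sketch with an invented damping time, not the 3D simulations proposed here:

```python
# Toy zero-dimensional energy bookkeeping (invented damping time; not the
# proposal's 3D MHD simulations): transverse wave energy decays exponentially
# and the dissipated energy is deposited as plasma heat, conserving the total.

def wave_heating(e_wave=1.0, tau=5.0, dt=0.1, steps=100):
    """Integrate dE_wave/dt = -E_wave/tau; the loss accumulates as heat."""
    e_heat = 0.0
    for _ in range(steps):
        de = e_wave * dt / tau  # energy dissipated during this step
        e_wave -= de
        e_heat += de
    return e_wave, e_heat

e_wave, e_heat = wave_heating()
print(round(e_wave, 3), round(e_heat, 3))  # their sum stays 1.0
```

The back-reaction studied in BOSS-WAVES is precisely what such a closed-form damping time cannot capture: heating changes the plasma, which in turn changes the damping.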

Summary

"Tissue engineering (TE), the interdisciplinary field combining biomedical and engineering sciences in the search for functional man-made organ replacements, has key issues with the quantity and quality of the generated products. Protocols followed in the lab are mainly trial-and-error based, requiring a huge number of manual interventions and lacking clear early time-point quality criteria to guide the process. As a result, these processes are very hard to scale up to industrial production levels. BRIDGE aims to fortify the engineering aspects of the TE field by adding a higher level of understanding and control to the manufacturing process (MP) through the use of in silico models. BRIDGE will focus on the bone TE field to provide proof of concept for its in silico approach.
The combination of the applicant's well-received published and ongoing work on a wide range of modelling tools in the bone field with the state-of-the-art experimental techniques present in the TE lab of the additional participant allows envisaging the following innovation and impact:
1. proof-of-concept of the use of an in silico blue-print for the design and control of a robust modular TE MP;
2. model-derived optimised culture conditions for patient derived cell populations increasing modular robustness of in vitro chondrogenesis/endochondral ossification;
3. in silico identification of a limited set of in vitro biomarkers that is predictive of the in vivo outcome;
4. model-derived optimised culture conditions increasing quantity and quality of the in vivo outcome of the TE MP;
5. incorporation of congenital defects in the in silico MP design, constituting a further validation of BRIDGE’s in silico approach and a necessary step towards personalised medical care.
We believe that the systematic – and unprecedented – integration of (bone) TE and mathematical modelling, as proposed in BRIDGE, is required to come to a rationalized, engineering approach to design and control bone TE MPs."

Max ERC Funding

1 191 440 €

Duration

Start date: 2011-12-01, End date: 2016-11-30

Project acronym

CRASH

Project

CRyptographic Algorithms and Secure Hardware

Researcher (PI)

François-Xavier Standaert

Host Institution (HI)

UNIVERSITE CATHOLIQUE DE LOUVAIN

Call Details

Starting Grant (StG), PE6, ERC-2011-StG_20101014

Summary

Side-channel attacks are an important threat against cryptographic implementations in which an adversary takes advantage of physical leakages, such as the power consumption of a smart card, in order to recover secret information. By circumventing the models in which standard security proofs are obtained, they can lead to powerful attacks against a large class of devices. As a consequence, formalizing implementation security and efficiently preventing side-channel attacks is one of the most challenging open problems in modern cryptography. Physical attacks imply new optimization criteria, with potential impact on the way we conceive algorithms and the way we design circuits. By putting together mathematical and electrical engineering problems, just as they are raised in reality, the CRASH project is expected to develop concrete foundations for the next generation of cryptographic algorithms and their implementation. For this purpose, three main directions will be considered. First, we will investigate sound evaluation tools for side-channel attacks and validate them on different prototype chips. Second, we will consider the impact of physical attacks on the mathematical aspects of cryptography, both destructively (i.e. by developing new attacks and advanced cryptanalysis tools) and constructively (i.e. by investigating new cipher designs and security proof techniques). Third, we will evaluate the possibility of integrating physical security analysis into the design tools of integrated circuits (e.g. in order to obtain “physical security aware” compilers). Summarizing, this project aims to break the barrier between the abstractions of mathematical cryptography and the concrete peculiarities of physical security in present microelectronic devices.
By considering the system and algorithmic issues in a unified way, it is expected to get rid of the incompatibilities between the separate formalisms that are usually considered in order to explain these concurrent realities.
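The kind of leakage described above can be made concrete with a toy correlation-based key recovery. The Hamming-weight leakage model, noise level and secret byte below are all invented for the sketch; real attacks target implementations of actual ciphers (e.g. AES S-box outputs) rather than a bare XOR:

```python
import random

# Toy simulation (leakage model, noise level and secret are all invented for
# this sketch): correlation-based recovery of a secret byte from simulated
# power traces that leak the Hamming weight of (plaintext XOR key).

random.seed(0)
SECRET = 0x3C
HW = [bin(v).count("1") for v in range(256)]  # Hamming-weight lookup table

plaintexts = [random.randrange(256) for _ in range(200)]
# One simulated power sample per encryption: leakage plus Gaussian noise.
traces = [HW[p ^ SECRET] + random.gauss(0, 0.5) for p in plaintexts]

def pearson(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The key guess whose predicted leakage best correlates with the measurements.
best = max(range(256), key=lambda k: pearson([HW[p ^ k] for p in plaintexts], traces))
print(hex(best))
```

The "sound evaluation tools" direction of the project asks, in essence, how many such traces a given implementation can withstand before the correct guess separates from the wrong ones.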

Max ERC Funding

1 498 874 €

Duration

Start date: 2011-10-01, End date: 2016-09-30

Project acronym

DRY-2-DRY

Project

Do droughts self-propagate and self-intensify?

Researcher (PI)

Diego González Miralles

Host Institution (HI)

UNIVERSITEIT GENT

Call Details

Starting Grant (StG), PE10, ERC-2016-STG

Summary

Droughts cause agricultural loss, forest mortality and drinking water scarcity. Their predicted increase in recurrence and intensity poses serious threats to future global food security. Several historically unprecedented droughts have already occurred over the last decade in Europe, Australia and the USA. The cost of the ongoing Californian drought is estimated to be about US$3 billion. Still today, the knowledge of how droughts start and evolve remains limited, and so does the understanding of how climate change may affect them.
Positive feedbacks from land have been suggested as critical for the occurrence of recent droughts: as rainfall deficits dry out soil and vegetation, the evaporation of land water is reduced, then the local air becomes too dry to yield rainfall, which further enhances drought conditions. Importantly, this is not just a 'local' feedback, as remote regions may rely on evaporated water transported by winds from the drought-affected region. Following this rationale, droughts self-propagate and self-intensify.
However, a global capacity to observe these processes is lacking. Furthermore, climate and forecast models are immature when it comes to representing the influences of land on rainfall. Do climate models underestimate this land feedback? If so, future drought aggravation will be greater than currently expected. At the moment, this remains largely speculative, given the limited number of studies of these processes.
I propose to use novel in situ and satellite records of soil moisture, evaporation and precipitation, in combination with new mechanistic models that can map water vapour trajectories and explore multi-dimensional feedbacks. DRY-2-DRY will not only advance our fundamental knowledge of the mechanisms triggering droughts, it will also provide independent evidence of the extent to which managing land cover can help 'dampen' drought events, and enable progress towards more accurate short-term and long-term drought forecasts.
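The self-intensification loop described above can be caricatured in a few lines with a bucket model in which a fraction of local evaporation returns as rainfall. All parameters are invented for illustration and bear no relation to DRY-2-DRY's mechanistic models:

```python
# Toy bucket model (all parameters invented; not the proposal's models):
# a fraction `recycling` of local evaporation returns as rainfall, so a
# rainfall deficit dries the soil, suppresses evaporation, and further
# reduces rainfall: the self-intensification loop described above.

def simulate(steps, advected_rain, recycling=0.5, s0=1.0):
    s = s0                # soil moisture, arbitrary units
    history = []
    for _ in range(steps):
        evap = 0.2 * s                           # evaporation scales with soil moisture
        rain = advected_rain + recycling * evap  # remote supply + local recycling
        s = max(0.0, s + rain - evap - 0.05)     # bucket update; 0.05 = drainage
        history.append(s)
    return history

wet = simulate(50, advected_rain=0.15)      # soil moisture holds its equilibrium
drought = simulate(50, advected_rain=0.05)  # reduced remote moisture supply
print(round(wet[-1], 2), round(drought[-1], 3))
```

In the reduced-supply run the feedback drives soil moisture toward zero rather than a new, slightly drier equilibrium, which is the qualitative signature of a self-intensifying drought.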

Max ERC Funding

1 465 000 €

Duration

Start date: 2017-02-01, End date: 2022-01-31

Project acronym

ERQUAF

Project

Entanglement and Renormalisation for Quantum Fields

Researcher (PI)

Jutho Jan J HAEGEMAN

Host Institution (HI)

UNIVERSITEIT GENT

Call Details

Starting Grant (StG), PE2, ERC-2016-STG

Summary

Over the past fifteen years, the paradigm of quantum entanglement has revolutionised the understanding of strongly correlated lattice systems. Entanglement and closely related concepts originating from quantum information theory are optimally suited for quantifying and characterising quantum correlations and have therefore proven instrumental for the classification of the exotic phases discovered in condensed quantum matter. One groundbreaking development originating from this research is a novel class of variational many-body wave functions known as tensor network states. Their explicit local structure and unique entanglement features make them very flexible and extremely powerful both as a numerical simulation method and as a theoretical tool.
The goal of this proposal is to lift this “entanglement methodology” into the realm of quantum field theory. In high energy physics, the widespread interest in entanglement has only been triggered recently due to the intriguing connections between entanglement and the structure of spacetime that arise in black hole physics and quantum gravity. During the past few years, direct continuum limits of various tensor network ansätze have been formulated. However, the application thereof is largely unexplored territory and holds promising potential. This proposal formulates several advancements and developments for the theoretical and computational study of continuous quantum systems, gauge theories and exotic quantum phases, but also for establishing the intricate relation between entanglement, renormalisation and geometry in the context of the holographic principle. Ultimately, these developments will radically alter the way in which to approach some of the most challenging questions in physics, ranging from the simulation of cold atom systems to non-equilibrium or high-density situations in quantum chromodynamics and the standard model.
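The entanglement bookkeeping behind tensor network states can be illustrated with the simplest possible case: a two-qubit Schmidt decomposition computed by SVD. This is a toy sketch of the underlying concept, not the proposal's methodology:

```python
import numpy as np

# Toy sketch (not the proposal's methodology): the Schmidt decomposition that
# underlies tensor network states. An SVD across a bipartition exposes the
# "bond" structure of a matrix product state, and the singular values give the
# bipartite entanglement entropy.

def entanglement_entropy(psi):
    """Entropy (in bits) between the two qubits of a 4-component state vector."""
    schmidt = np.linalg.svd(psi.reshape(2, 2), compute_uv=False)
    p = schmidt ** 2               # squared Schmidt coefficients = probabilities
    p = p[p > 1e-12]               # drop numerically zero Schmidt weights
    return float(-(p * np.log2(p)).sum())

product = np.array([1.0, 0.0, 0.0, 0.0])            # |00>, unentangled
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
print(entanglement_entropy(product), entanglement_entropy(bell))  # ~0 and ~1 bit
```

A matrix product state truncates exactly these singular values at every bond, which is why the entanglement structure of a state dictates how efficiently a tensor network can represent it.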

Max ERC Funding

1 499 375 €

Duration

Start date: 2017-02-01, End date: 2022-01-31

Project acronym

FLICs

Project

Enabling flexible integrated circuits and applications

Researcher (PI)

Kris Jef Ria Myny

Host Institution (HI)

INTERUNIVERSITAIR MICRO-ELECTRONICA CENTRUM

Call Details

Starting Grant (StG), PE7, ERC-2016-STG

Summary

Thin-film transistor technologies are present in many products today that require an active transistor backplane, e.g. flat-panel displays and flat-panel photodetector arrays. Unipolar n-type transistors with amorphous indium gallium zinc oxide (a-IGZO) as the semiconductor are currently the most promising technology for next-generation products demanding high-performance, low-power transistors manufacturable on flexible substrates, enabling curved, bendable and even rollable displays. a-IGZO is a wide-bandgap material characterized by extremely low off-state leakage currents and an electron mobility of ~20 cm2/Vs. IGZO transistors fabricated on flexible substrates will also find use in applications that require flexible integrated circuits.
The goal of this FLICs proposal is to develop disruptive technology and ground-breaking design innovations with amorphous oxide TFTs on plastic substrates, targeting large scale or very large scale flexible integrated circuits with unprecedented characteristics in terms of power consumption, supply voltage and operating speed, for applications in IoT and wearable healthcare sensor patches.
We introduce a new logic style, “quasi-CMOS”, which is based on unipolar, oxide dual-gate thin-film transistors. This logic style will drastically decrease the power consumption of unipolar logic gates in a novel way by taking advantage of dynamic backgate driving and of the transistor’s unique low off-state leakage current, without compromising on switching speed. In addition, we also introduce downscaling of the transistor’s dimensions, while remaining compatible with upscaling to large-area manufacturing platforms. Finally, we will investigate novel ultralow-power design techniques on system-level, while exploiting the quasi-CMOS logic gates.
We will demonstrate the power of this innovation with circuits for item-level Internet-of-Things, UHF RFID, and wearable health sensor patches.

Max ERC Funding

1 499 155 €

Duration

Start date: 2017-01-01, End date: 2021-12-31

Project acronym

FLUOROCODE

Project

FLUOROCODE: a super-resolution optical map of DNA

Researcher (PI)

Johan M. V. Hofkens

Host Institution (HI)

KATHOLIEKE UNIVERSITEIT LEUVEN

Call Details

Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209

Summary

"There has been an immense investment of time, effort and resources in the development of the technologies that enable DNA sequencing in the past 10 years. Despite the significant advances made, all current genomic sequencing technologies suffer from two important shortcomings. Firstly, sample preparation is time-consuming and expensive, requiring a full day for next-generation sequencing experiments. Secondly, sequence information is delivered in short fragments, which are then assembled into a complete genome. Assembly is time-consuming and often results in a highly fragmented genomic sequence and the loss of important information on large-scale structural variation within the genome.
We recently developed a super-resolution DNA mapping technology, which allows us to uniquely study genetic-scale features in genomic-length DNA molecules. Labelling the DNA with fluorescent molecules at specific sequences and using high-resolution fluorescence microscopy enabled us to produce a map of a genomic DNA sequence with unparalleled resolution, the so-called FLUOROCODE. In this project we aim to extend our methodology to map longer DNA molecules and to include a multi-colour version of the FLUOROCODE that will allow us to read genomic DNA molecules like a barcode and probe DNA methylation status. The sample preparation, DNA labelling and deposition for imaging will be integrated to allow rapid mapping of DNA molecules. At the same time nanopores will be explored as a route to high-throughput DNA mapping.
FLUOROCODE will develop technology that aims to complement the information derived from current DNA sequencing platforms. The technology developed by FLUOROCODE will enable DNA mapping at unprecedented speed and for a fraction of the cost of a typical DNA sequencing project. We anticipate that our method will find applications in the rapid identification of pathogens and in producing genomic scaffolds to improve genome sequence assembly."

Max ERC Funding

2 423 160 €

Duration

Start date: 2012-09-01, End date: 2017-08-31

Project acronym: HELIOS

Project: Heavy Element Laser Ionization Spectroscopy

Researcher (PI): Pieter Van Duppen

Host Institution (HI): KATHOLIEKE UNIVERSITEIT LEUVEN

Call Details: Advanced Grant (AdG), PE2, ERC-2011-ADG_20110209

Summary: The aim of this proposal is to develop a novel laser-spectroscopy method and to study nuclear and atomic properties of the heaviest elements in order to address the following key questions:
- Is the existence of the heaviest isotopes determined by the interplay between single-particle and collective nucleon degrees of freedom in the atomic nucleus?
- How do relativistic effects and isotopic composition influence the valence atomic structure of the heaviest elements?
The new approach is based on in-gas-jet, high-repetition, high-resolution laser resonance ionization spectroscopy of short-lived nuclear-reaction products stopped in a buffer gas cell. The final goal is to couple the new system to the strongest production facility currently under construction, the ESFRI-listed SPIRAL-2 facility at GANIL (France), and to study isotopes from actinium to nobelium and heavier elements.
An increase of the primary intensity, efficiency, selectivity and spectral resolution by one order of magnitude compared to present-day techniques is envisaged, which is essential to obtain the required data.
The challenges are:
- decoupling the high-intensity heavy ion production beam (> 10^14 particles per second) from the low-intensity reaction products (few atoms per second)
- cooling of the reaction products from MeV/u to meV/u within less than a hundred milliseconds
- separating the wanted isotopes from the unwanted ones, which are present in overwhelming, orders-of-magnitude larger quantities
- performing high-resolution laser spectroscopy on a minute number of atoms in an efficient way.
Nuclear properties (charge radii, nuclear moments and spins) as well as atomic properties (transition energies and ionization potentials) will be deduced in regions of the nuclear chart where they are not known: the neutron-deficient isotopes of the actinide elements, up to nobelium (Z = 102) and beyond. The data will validate state-of-the-art calculations, identify critical weaknesses and guide further theoretical developments.
