Host Institution (HI): IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE

Call Details: Consolidator Grant (CoG), PE2, ERC-2015-CoG

Summary: The unique properties of a new type of X-ray source produced by a compact laser-plasma accelerator will be used to probe the ultra-fast dynamics of the electronic structure of matter under extreme conditions.
The TeX-MEx project will study: 1) hot dense matter, such as that found at the centre of the Sun; 2) warm dense matter, such as that found at the centre of Jupiter; and 3) photo-ionized plasmas far from equilibrium, such as those found in the exotic environment of an accretion disk surrounding a black hole. These extreme conditions will be created in the laboratory using 1) direct laser heating, 2) proton heating and laser-driven shock heating, and 3) intense X-ray pumping using the betatron source itself and the extraordinary X-ray fluxes available at a free-electron laser.
Using the unique combination of few-femtosecond duration and broad spectral coverage that the X-rays produced by a laser wakefield accelerator possess, the TeX-MEx project will explore new physics in each of these regimes. For example, we will directly measure the ionization rates of hot dense matter for the first time; we will observe the onset of ion motion in warm dense matter and how it affects the electron energy levels; and we will make the first observations of non-collisional photo-ionized plasmas. These measurements will allow us to accurately test and develop models used to describe matter under extreme conditions in the laboratory and in astrophysics.
This integrated programme of innovative experiments and new approaches to modelling will open up a new field of femtosecond time-resolved absorption spectroscopy of matter under extreme conditions and will drastically improve our understanding of how matter behaves throughout our Universe. It will, for the first time, bring to our laboratories on Earth the ability to probe some of Nature's most violent processes, to date only hinted at in data from a new generation of astronomical instruments.

Max ERC Funding

1 996 316 €

Duration

Start date: 2016-07-01, End date: 2021-06-30

Project acronym: ThDEFINE

Project: Re(defining) CD4+ T Cell Identities One Cell at a Time

Researcher (PI): Sarah Amalia Polonius-Teichmann

Host Institution (HI): GENOME RESEARCH LIMITED

Call Details: Consolidator Grant (CoG), LS2, ERC-2014-CoG

Summary: The immune system consists of a complex continuum of cell types that communicate with each other and with non-immune tissues in homeostasis and during infections, autoimmunity and cancer. Conventional transcriptional and functional profiling, enabled by cell-surface-marker sorting, has revealed a great deal about how specific cell types operate en masse, yet important transcriptional heterogeneity that exists within cell populations remains unexplored. High-throughput single-cell RNA-seq can overcome this limitation by profiling the entire transcriptomes of thousands of individual cells, revealing cell-to-cell variation by decoding patterns within populations that are masked in bulk transcriptomes. We will exploit this to dissect the mouse CD4+ T cell compartment, a heterogeneous white blood cell population that initiates adaptive immune responses.
In AIM 1, we will chart the dynamics of in vivo CD4+ cell states in mouse before, during and after immune response challenges. By sequencing thousands of single-cell transcriptomes, we will map the landscape of CD4+ T cell states in an unbiased, quantitative and comprehensive way.
In AIM 2, we will predict key transcription factors, cell surface markers, and signalling molecules, including cytokines/chemokines, in each cell state through novel computational approaches. Furthermore, our analyses will establish regulatory modules and networks of gene-gene interactions active in immune responses.
In AIM 3, we will (a) confirm the in vivo impact of new cell states by performing adoptive cell transfer assays; and (b) validate our predictions of regulatory molecules and interactions using a massively parallel CRISPR/Cas knockout screen in vitro.
This powerful integrated approach combines single-cell RNA-sequencing, bioinformatics and genetic engineering to dissect CD4+ T cell states, a central compartment of mammalian adaptive immunity, and to reveal basic principles of gene regulation.
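The central motivation above, that a bulk average can mask coexisting cell states that single-cell measurements resolve, can be made concrete with a toy simulation. The numbers and the one-gene threshold below are purely illustrative stand-ins (not project data, and not real clustering of transcriptomes):

```python
import random
import statistics

random.seed(1)

# Toy expression of one marker gene in two hidden cell states:
# state A expresses it highly, state B barely at all.
state_a = [random.gauss(10, 1) for _ in range(500)]
state_b = [random.gauss(0.5, 0.3) for _ in range(500)]
cells = state_a + state_b

# Bulk profiling only sees the population average...
bulk_mean = statistics.mean(cells)

# ...while per-cell measurements reveal the bimodal structure, here via a
# simple threshold standing in for clustering of full transcriptomes.
threshold = 5.0
recovered_a = [c for c in cells if c > threshold]
recovered_b = [c for c in cells if c <= threshold]

print(f"bulk mean: {bulk_mean:.2f}")  # ~5.25, a level no actual cell has
print(f"recovered states: {len(recovered_a)} high, {len(recovered_b)} low")
```

The bulk mean lands between the two modes and describes neither cell state, which is exactly the limitation single-cell RNA-seq removes.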

Max ERC Funding

1 980 685 €

Duration

Start date: 2016-01-01, End date: 2020-12-31

Project acronym: TheHiggsAndThe7Tops

Project: Mirror Mirror on the Wall, which Higgs is the oddest of them all: Exploring the Top-Higgs Interconnection with ATLAS

Researcher (PI): Reinhild Fatima Yvonne PETERS

Host Institution (HI): THE UNIVERSITY OF MANCHESTER

Call Details: Consolidator Grant (CoG), PE2, ERC-2018-CoG

Summary: With the ground-breaking discovery of a new scalar particle, the Higgs boson, in 2012 by ATLAS and CMS, the standard model (SM) of particle physics has been completed. Despite this success, many open questions about the fundamental laws of nature remain unanswered, among them how exactly particles acquire their mass and why there is more matter than antimatter in the universe.
One of the most promising avenues for approaching these questions is to explore the relation between the Higgs boson and the heaviest known elementary particle: the top quark. Due to its large mass, the top is expected to play a special role in the mechanism of electroweak symmetry breaking. To shed light on this mechanism, it is essential to understand the coupling between the top and the Higgs in great detail and to explore the charge-parity (CP) nature of the Higgs. While the SM Higgs boson is CP-even, many models beyond the SM require a CP-odd component. Higgs-top couplings are expected to provide an unambiguous probe of CP-mixed states.
I will explore for the first time all processes in which a direct determination of the top-Higgs interconnection is feasible, in particular events where the Higgs is produced in association with 1, 2 or 4 top quarks. These are among the most challenging channels at the LHC. I will pioneer a comprehensive programme, consisting of the development of powerful event-reconstruction methods and improved boosting techniques, allowing the first exploitation of novel variables in a beyond-state-of-the-art cross-process analysis and thus unravelling the CP properties of the top-Higgs interaction.
The ultimate goal of the project is the precise direct measurement of the top-Higgs Yukawa coupling and the first determination of the CP nature of the Higgs boson in fermion interactions. Confronting these results with the SM and with models beyond the SM will yield unprecedented insight into the origin of the mass of elementary particles.

Summary: Density-functional theory (DFT) is the most widely used method to study the electronic structure of complex molecules, solids, and materials. Its use across chemistry, solid-state physics and materials science is a testament to its black-box nature and low cost. However, many important areas remain inaccessible to DFT simulations, including applications to strongly correlated materials and systems in electromagnetic fields. The topDFT project will deliver new conceptual approaches to designing the next generation of density-functional methods. This will be achieved by pursuing three parallel strategies: i) developing new strategies for the design of functionals; ii) implementing topological DFT, a new computational framework; and iii) developing extended density-functional theories.
A new approach to the exchange–correlation problem, based on a perspective from the kinetic energy of the electrons, will be developed, leading to new practical density-functional approximations (DFAs). A new computational framework will be developed by combining techniques from topological electronic-structure methods with DFT, allowing for the identification of correlation ‘hotspots’. This idea is chemically intuitive: electrons close together interact in a fundamentally different way from those far apart. Recognising these hotspots, and adapting dynamically to them, will lead to new DFAs with substantially greater accuracy.
Extended DFTs will open the way to studying strongly correlated systems (e.g. high-Tc superconductors, transition metal oxides, Mott insulators) of importance in chemistry and materials science, and magnetic systems (e.g. molecular magnets, spin glasses, spin-frustrated systems) of importance in nano-science, advanced materials and spintronics applications. The topDFT project will have wide impact on areas including chemical synthesis, materials design and nano-science that underpin key fields such as manufacturing and medicine, of benefit to all sections of society.

Max ERC Funding

1 998 649 €

Duration

Start date: 2018-05-01, End date: 2023-04-30

Project acronym: TOPOLOGICAL

Project: Topological Light at Structured Surfaces

Researcher (PI): Shuang Zhang

Host Institution (HI): THE UNIVERSITY OF BIRMINGHAM

Call Details: Consolidator Grant (CoG), PE2, ERC-2014-CoG

Summary: Using metamaterials and metasurfaces as the platform, this proposal focuses on the novel topological physics and applications introduced by the Berry phase. The flexibility in engineering the artificial ‘atoms’ and ‘molecules’ of metamaterials provides unlimited possibilities to create new structural effects in which symmetry (or symmetry breaking) and topology play critical roles. We are particularly interested in the role the Berry phase plays in various nontrivial surface optical effects, including topological surface states and the spin Hall effect of light. The investigation of scattering-immune surface states in a topological metamaterial, i.e. an effective-medium approach, serves to unify the spin Hall effect of light with the more unconventional scheme of topological orders and protected surface states. We will further exploit the Berry phase in the nonlinear regime, in particular harmonic generation, to control nonlinear coefficients to an unprecedented level. Our study of the Berry phase in the nonlinear regime will therefore point to a new research direction, nonlinear-coefficient engineering, which will have an important impact in the area of nonlinear optics. The proposal also investigates practical applications brought by a novel type of geometrical metasurface, where the phase, and hence the wavefront, is finely controlled by the Berry phase in a highly robust manner. The proposal involves the development of innovative synthesis technologies, theoretical analysis, numerical simulations, experimental characterization, and device development. The new symmetry and topological effects in this research will greatly impact a number of disciplines, including materials science, condensed matter physics and photonics.
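The geometrical-metasurface principle mentioned above, wavefront control through the geometric (Pancharatnam-Berry) phase of rotated anisotropic elements, can be seen in a short Jones-calculus sketch. This is a textbook toy calculation, not taken from the proposal: a half-wave-plate-like element rotated by an angle theta flips the handedness of circularly polarized light and imprints a phase of 2*theta set purely by the element's orientation (sign conventions for circular polarization vary; one common choice is used here).

```python
import cmath
import math

def hwp(theta):
    """Jones matrix of a half-wave plate with fast axis at angle theta,
    global phase dropped: [[cos 2t, sin 2t], [sin 2t, -cos 2t]]."""
    c, s = math.cos(2 * theta), math.sin(2 * theta)
    return [[c, s], [s, -c]]

def apply(M, v):
    """Apply a 2x2 Jones matrix to a 2-component field vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# Circular input (1, -i)/sqrt(2): the rotated element converts it to the
# opposite handedness with a purely geometric phase factor exp(-2i*theta).
inv_sqrt2 = 1 / math.sqrt(2)
lcp = [inv_sqrt2, -1j * inv_sqrt2]

theta = 0.3  # element rotation in radians (illustrative value)
out = apply(hwp(theta), lcp)
geom_phase = cmath.phase(out[0])
print(f"geometric phase: {geom_phase:.3f} rad (expect -2*theta = {-2 * theta:.3f})")
```

Because the imprinted phase depends only on the element orientation, spatially varying the rotation angle across a surface sculpts the wavefront in the robust way the abstract describes.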

Max ERC Funding

1 997 600 €

Duration

Start date: 2015-12-01, End date: 2020-11-30

Project acronym: TRANSLATE

Project: Specificity of translational control during unfolded protein response

Researcher (PI): Jernej Ule

Host Institution (HI): UNIVERSITY COLLEGE LONDON

Call Details: Consolidator Grant (CoG), LS1, ERC-2013-CoG

Summary: The unfolded protein response (UPR) is activated by multiple types of cellular stress and can promote either cell survival or apoptosis. The balance between these opposing outcomes is delicately regulated and, when lost, contributes to diverse diseases. The UPR enables cells to halt general translation while inducing translation and transcription of specific mRNAs that escape repression. Even though the general machinery controlling translation is well understood, several fundamental open questions remain: 1) how are mRNAs selected for translation during the UPR; 2) what role do mRNA structure and sequence play in this selection; and 3) what role does the UPR pathway play in highly differentiated cells, such as neurons? My lab employs an integrative approach to understand how RNA-binding proteins (RBPs) control specific mRNAs. We recently developed hiCLIP, a method that globally quantifies interactions between RBPs and double-stranded RNA in live cells. Our preliminary findings demonstrate that a double-stranded-RNA-binding protein binds to structured motifs in mRNAs to control stress-induced translation. I propose to determine how combinatorial recognition of RNA sequence and structure by RBPs controls mRNA localisation, stability and translation during the UPR. In addition, we will assess the role of the UPR pathway in neuronal differentiation. Taken together, this study aims to elucidate how cells select specific mRNAs for translation, and thereby survive stress or respond to signals that control differentiation.

Max ERC Funding

1 999 435 €

Duration

Start date: 2014-03-01, End date: 2019-02-28

Project acronym: UbiArchitect

Project: Understanding the complexity and architecture in protein ubiquitination

Researcher (PI): David KOMANDER

Host Institution (HI): MEDICAL RESEARCH COUNCIL

Call Details: Consolidator Grant (CoG), LS1, ERC-2016-CoG

Summary: The posttranslational modification of proteins with polyubiquitin regulates virtually all aspects of cell biology. This versatility arises from eight distinct linkage types between individual ubiquitin moieties in polyubiquitin, which co-exist in cells, are independently regulated, and ultimately determine the fate of the modified protein. However, ubiquitin chain architecture can be highly complex, and the extent of ‘chain branching’ is unknown. Moreover, ubiquitin also undergoes phosphorylation and acetylation, which can dramatically alter its function.
A true appreciation of the complexity of the ubiquitin code can only be achieved when all of the above aspects are considered; only then will it be possible to assign cellular readouts to distinct ubiquitination events and to differentiate between ubiquitin signals in cells.
While the complexity of ubiquitination is daunting, work from many laboratories, including my own, has exemplified how basic biochemistry, a detailed understanding of mechanism and quantitative mass spectrometry allow us to study, and eventually understand, the ubiquitin code.
In this proposal, new methods and approaches are outlined that will allow detailed monitoring of polyubiquitin chain architectures in cellular samples (AIM 1) and lead to an in-depth understanding of additional posttranslational modifications, such as ubiquitin phosphorylation and acetylation, in cells (AIM 2). Moreover, new research tools for the unstudied K6- and K33-linked polyubiquitins will give insights into the cellular roles of these linkage types (AIM 3). Our studies will focus on ubiquitination events on mitochondria leading to mitophagy, where unstudied K6-linked chains as well as phospho-ubiquitin are part of complex chain architectures and the mechanisms of signalling are still unclear. Our work will reveal fundamental principles of ubiquitination and is of high medical relevance due to links to Parkinson’s disease, infectious disease, and cancer.

Max ERC Funding

1 990 125 €

Duration

Start date: 2017-10-01, End date: 2022-09-30

Project acronym: UQMSI

Project: Uncertainty Quantification and Modern Statistical Inference

Researcher (PI): Richard Nickl

Host Institution (HI): THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE

Call Details: Consolidator Grant (CoG), PE1, ERC-2014-CoG

Summary: Some of the most important and exciting challenges of our ‘information age’ have led to the development of novel statistical methodology and algorithms designed to deal with inference settings involving high dimensionality, graphical and network structures, inverse problems, ‘big data’, stochastic differential equations, diffusion processes, cosmic microwave background maps, brain tomography, etc.
While an abundance of algorithms is now available, a scientifically rigorous theory of uncertainty quantification and statistical decision making for such procedures has not yet been developed. Traditional approaches such as maximum likelihood estimation or parametric Bayesian inference cannot be used naively in increasingly complex contemporary statistical models. The construction of confidence statements and of critical values for significance tests is, however, of crucial importance for all applications of the statistical sciences to the modern world.
In this research we propose an objective, mathematically rigorous, and practical paradigm for uncertainty quantification in modern statistical inference problems, and we illustrate how this approach can be used in some recently emerged areas of statistics. Our theory can validate both Bayesian and frequentist approaches to statistical inference, and can be expected to be optimal in an information-theoretic sense. It has potential impact on all areas of scientific theory building, on legal and medical practice, on public management of the internet, modern media and other information structures, and on the foundations of the mathematical discipline of statistics itself.
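To fix ideas on what a confidence statement is, here is the simplest standard example of algorithmic uncertainty quantification, a percentile-bootstrap confidence interval. This is a toy illustration of the general notion only, not the rigorous theory the proposal aims to build, and all numbers are illustrative:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic.

    Resamples the data with replacement, recomputes the statistic on each
    resample, and reports the empirical alpha/2 and 1-alpha/2 quantiles
    of the resulting bootstrap distribution.
    """
    rng = random.Random(seed)
    n = len(data)
    stats = sorted(stat([rng.choice(data) for _ in range(n)])
                   for _ in range(n_boot))
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Toy data: noisy observations around a true mean of 5.
rng = random.Random(42)
data = [5 + rng.gauss(0, 1) for _ in range(100)]
lo, hi = bootstrap_ci(data)
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")
```

In the complex, high-dimensional settings the proposal targets, naive recipes like this one can fail to have their nominal coverage, which is precisely why a rigorous theory of when such intervals can be trusted is needed.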

Max ERC Funding

1 733 767 €

Duration

Start date: 2015-09-01, End date: 2021-02-28

Project acronym: Vort3DEuler

Project: 3D Euler, Vortex Dynamics and PDE

Researcher (PI): Jose Luis Rodrigo

Host Institution (HI): THE UNIVERSITY OF WARWICK

Call Details: Consolidator Grant (CoG), PE1, ERC-2013-CoG

SummaryThis proposal deals with a collection of problems in PDE arising from fluid mechanics. The primary motivation is the understanding of the evolution of isolated vortex lines for 3D Euler. The importance of the evolution of vorticity in incompressible fluid mechanics is very well known.
To date, only non-rigorous approaches are known for obtaining an evolution equation for isolated vortex lines. Two desingularization procedures are carried out (including a time renormalization) to obtain an evolution equation (the binormal equation). While an isolated vortex line does not fit any known concept of solution (given the singularity of the velocity), and there has been significant recent activity on the non-uniqueness of solutions of Euler (De Lellis & Szekelyhidi, and more recently Isett), it is expected that the geometric assumptions made about the solution will still make it possible to find a suitable concept of solution. In the proposal I describe an approach that should help to rigorously understand vortex lines. It is motivated by a programme developed for the Surface Quasi-Geostrophic (SQG) equation with C. Fefferman and for some related desingularized models with my student Zoe Atkins (Nov 2012 PhD).
SQG has been of great interest in the PDE community due to the striking similarities it exhibits with 3D Euler. In particular, the evolution of sharp fronts for SQG corresponds to the evolution of vortex lines. In recent years I have developed an approach that overcomes the divergences known to exist for the velocity field (as in 3D Euler). The positive results obtained for SQG motivate the methodology and tools described in the proposal, including the construction of solutions with very large gradients and simple geometry, and the use of a measure-theoretic approach to identify fundamental curves within these objects. Surprising connections with other equations motivate some other directions and linked projects, for example with Prandtl and boundary layer theory.
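For the reader's orientation (standard background equations, not results of the proposal): the binormal, or local induction, equation produced by the desingularization mentioned above evolves the filament curve by its curvature along the binormal direction, and SQG transports an active scalar by a velocity field one derivative more singular than in 2D Euler:

```latex
% Binormal (local induction) flow for a vortex filament \gamma(s,t),
% with s the arclength parameter:
\partial_t \gamma \;=\; \partial_s \gamma \times \partial_{ss}\gamma \;=\; \kappa\, b,
\qquad \kappa = \text{curvature}, \quad b = \text{unit binormal}.

% Surface Quasi-Geostrophic (SQG) equation for the active scalar \theta:
\partial_t \theta + u \cdot \nabla \theta = 0,
\qquad u = \nabla^{\perp} (-\Delta)^{-1/2}\,\theta .
```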

Max ERC Funding

1 182 858 €

Duration

Start date: 2014-07-01, End date: 2019-06-30

Project acronymWallCrossAG

ProjectWall-Crossing and Algebraic Geometry

Researcher (PI)Arend BAYER

Host Institution (HI)THE UNIVERSITY OF EDINBURGH

Call DetailsConsolidator Grant (CoG), PE1, ERC-2018-CoG

SummaryWe will establish stability conditions and wall-crossing in derived categories as a standard methodology for a wide range of fundamental problems in algebraic geometry. Previous work based on wall-crossing, in particular my joint work with Macri, has led to breakthroughs on the birational geometry of moduli spaces and related varieties. Recent advances have made clear that the power of stability conditions extends far beyond this setting, allowing us to study vanishing theorems or bounds on global sections, Brill-Noether problems, or moduli spaces of varieties.
The Brill-Noether problem is one of the oldest and most fundamental questions of algebraic geometry, aiming to classify the possible degrees and embedding dimensions of embeddings of a given variety into projective spaces. Recent work by myself, my post-doc (Chunyi Li) and my PhD student (Feyzbakhsh) has established wall-crossing as a powerful new method for such questions. We will push this method further, all the way towards a proof of Green's conjecture, and of the Green-Lazarsfeld conjecture, for all smooth curves.
We will use similar methods to prove new Bogomolov-Gieseker type inequalities for Chern classes of stable sheaves and complexes on higher-dimensional varieties. In addition to constructing stability conditions on projective threefolds (the biggest open problem within the theory of stability conditions), we will apply them to study moduli spaces of sheaves on higher-dimensional varieties, and to characterise special abelian varieties.
We will use the construction of stability conditions for families of varieties in my current joint work to systematically study the geometry of Fano threefolds and fourfolds, in particular their moduli spaces, by establishing relations between different moduli spaces, and describing their Torelli maps. Finally, we will study rationality questions, with a particular view towards a wall-crossing proof of the irrationality of the very general cubic fourfold.
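For context (the classical statement that the "Bogomolov-Gieseker type" inequalities above generalise; standard background, not a result of this project): for a $\mu$-semistable torsion-free sheaf $E$ of rank $r$ on a smooth complex projective surface, Bogomolov's inequality bounds the discriminant:

```latex
% Classical Bogomolov inequality on a smooth projective surface:
\Delta(E) \;=\; 2r\,c_2(E) \;-\; (r-1)\,c_1(E)^2 \;\geq\; 0,

% equivalently, in terms of Chern characters
% (ch_0 = r, ch_1 = c_1, ch_2 = (c_1^2 - 2c_2)/2):
\operatorname{ch}_1(E)^2 \;-\; 2\operatorname{ch}_0(E)\operatorname{ch}_2(E) \;\geq\; 0 .
```

The open problem for projective threefolds referred to above concerns conjectural extensions of such inequalities involving the third Chern character.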
