ERC FUNDED PROJECTS

Project acronym: 0MSPIN

Project: Spintronics based on relativistic phenomena in systems with zero magnetic moment

Researcher (PI): Tomáš Jungwirth

Host Institution (HI): FYZIKALNI USTAV AV CR V.V.I

Call Details: Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224

Summary: The 0MSPIN project consists of an extensive integrated theoretical, experimental and device development programme of research opening a radical new approach to spintronics. Spintronics has the potential to supersede existing storage and memory applications, and to provide alternatives to current CMOS technology. The ferromagnetic metals used in all current spintronics applications may make it impractical to realise the full potential of spintronics: metals are unsuitable for transistor and information processing applications, for opto-electronics, or for high-density integration. The 0MSPIN project aims to remove the major road-block holding back the development of spintronics in a radical way: by removing the ferromagnetic component from key active parts, or from the whole, of spintronic devices. This approach is based on exploiting the combination of exchange and spin-orbit coupling phenomena and material systems with zero macroscopic moment. The goal of 0MSPIN is to provide a new paradigm by which spintronics can enter the realms of conventional semiconductors, in both fundamental condensed matter research and in information technologies. In the central part of the proposal, the research towards this goal is embedded within a materials science project whose aim is to introduce into physics and microelectronics an entirely new class of semiconductors. 0MSPIN seeks to exploit three classes of material systems: (1) antiferromagnetic bi-metallic 3d-5d alloys (e.g. Mn2Au); (2) antiferromagnetic I-II-V semiconductors (e.g. LiMnAs); (3) non-magnetic spin-orbit coupled semiconductors with injected spin-polarized currents (e.g. 2D III-V structures). Proof-of-concept devices operating at high temperatures will be fabricated to showcase new functionalities offered by zero-moment systems for sensing and memory applications, information processing, and opto-electronics technologies.

Summary: Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups which we may discover. This will be possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes, and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and determine the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene we will furthermore shed light on the early history and origins of Neandertals and Denisovans.

Max ERC Funding

2 350 000 €

Duration

Start date: 2016-11-01, End date: 2021-10-31

Project acronym: 14Constraint

Project: Radiocarbon constraints for models of C cycling in terrestrial ecosystems: from process understanding to global benchmarking

Summary: The overall goal of 14Constraint is to enhance the availability and use of radiocarbon data as constraints for process-based understanding of the age distribution of carbon in, and respired by, soils and ecosystems. Carbon enters ecosystems by a single process, photosynthesis. It returns by a range of processes that depend on plant allocation and turnover, the efficiency and rate of litter decomposition, and the mechanisms stabilizing C in soils. Thus the age distribution of respired CO2 and the age of C residing in plants, litter and soils are diagnostic properties of ecosystems that provide key constraints for testing carbon cycle models. Radiocarbon, especially the transit of 'bomb' 14C created in the 1960s, is a powerful tool for tracing C exchange on decadal to centennial timescales. 14Constraint will assemble a global database of existing radiocarbon data (WP1) and demonstrate how they can constrain and test ecosystem carbon cycle models. WP2 will fill data gaps and add new data from sites in key biomes that have ancillary data sufficient to construct belowground C and 14C budgets. These detailed investigations will focus on the role of time lags introduced by necromass and fine roots, as well as the dynamics of deep soil C. Spatial extrapolation beyond the WP2 sites will require sampling along global gradients designed to explore the relative roles of mineralogy, vegetation and climate on the age of C in, and respired from, soil (WP3). Products of 14Constraint will include the first publicly available global synthesis of terrestrial 14C data, adding over 5000 new measurements. This project is urgently needed before atmospheric 14C levels decline to below 1950 levels, as expected in the next decade.

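As a hedged illustration of the quantities involved (not the project's actual models): a conventional radiocarbon age follows from exponential decay with the Libby mean life, and in the simplest one-pool, steady-state soil model the mean age of carbon in (and respired by) the pool equals its turnover time. A minimal sketch:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years, derived from the Libby half-life of 5568 yr

def conventional_radiocarbon_age(fraction_modern):
    """Conventional 14C age (years BP) from a fraction-modern measurement."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

def mean_carbon_age_one_pool(turnover_time_years):
    """In a single well-mixed pool at steady state, the mean age of carbon
    (and of the respired CO2) equals the pool's turnover time."""
    return turnover_time_years

# A sample retaining 88% of the modern 14C/12C ratio dates to roughly 1000 yr BP.
age = conventional_radiocarbon_age(0.88)
```

Real ecosystems are of course multi-pool, which is exactly why the age *distribution*, rather than a single mean age, is the diagnostic the project targets.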
Max ERC Funding

2 283 747 €

Duration

Start date: 2016-12-01, End date: 2021-11-30

Project acronym: 15CBOOKTRADE

Project: The 15th-century Book Trade: An Evidence-based Assessment and Visualization of the Distribution, Sale, and Reception of Books in the Renaissance

Researcher (PI): Cristina Dondi

Host Institution (HI): THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD

Call Details: Consolidator Grant (CoG), SH6, ERC-2013-CoG

Summary: The idea that underpins this project is to use the material evidence from thousands of surviving 15th-c. books, as well as unique documentary evidence — the unpublished ledger of a Venetian bookseller in the 1480s which records the sale of 25,000 printed books with their prices — to address four fundamental questions relating to the introduction of printing in the West which have so far eluded scholarship, partly because of lack of evidence, partly because of the lack of effective tools to deal with existing evidence. The book trade differs from other trades operating in the medieval and early modern periods in that the goods traded survive in considerable numbers. Not only do they survive, but many of them bear stratified evidence of their history in the form of marks of ownership, prices, manuscript annotations, binding and decoration styles. A British Academy pilot project conceived by the PI produced a now internationally-used database which gathers together this kind of evidence for thousands of surviving 15th-c. printed books. For the first time, this makes it possible to track the circulation of books, their trade routes and later collecting, across Europe and the USA, and throughout the centuries. The objectives of this project are to examine (1) the distribution and trade-routes, national and international, of 15th-c. printed books, along with the identity of the buyers and users (private, institutional, religious, lay, female, male, and by profession) and their reading practices; (2) the books' contemporary market value; (3) the transmission and dissemination of the texts they contain, their survival and their loss (rebalancing potentially skewed scholarship); and (4) the circulation and re-use of the illustrations they contain. Finally, the project will experiment with the application of scientific visualization techniques to represent, geographically and chronologically, the movement of 15th-c. printed books and of the texts they contain.

Max ERC Funding

1 999 172 €

Duration

Start date: 2014-04-01, End date: 2019-03-31

Project acronym: 19TH-CENTURY_EUCLID

Project: Nineteenth-Century Euclid: Geometry and the Literary Imagination from Wordsworth to Wells

Researcher (PI): Alice Jenkins

Host Institution (HI): UNIVERSITY OF GLASGOW

Call Details: Starting Grant (StG), SH4, ERC-2007-StG

Summary: This radically interdisciplinary project aims to bring a substantially new field of research – literature and mathematics studies – to prominence as a tool for investigating the culture of nineteenth-century Britain. It will result in three kinds of outcome: a monograph, two interdisciplinary and international colloquia, and a collection of essays. The project focuses on Euclidean geometry as a key element of nineteenth-century literary and scientific culture, showing that it was part of the shared knowledge flowing through elite and popular Romantic and Victorian writing, and figuring notably in the work of very many of the century's best-known writers. Despite its traditional cultural prestige and educational centrality, geometry has been almost wholly neglected by literary history. This project shows how literature and mathematics studies can draw a new map of nineteenth-century British culture, revitalising our understanding of the Romantic and Victorian imagination through its writing about geometry.

Max ERC Funding

323 118 €

Duration

Start date: 2009-01-01, End date: 2011-10-31

Project acronym: 1D-Engine

Project: 1D-electrons coupled to dissipation: a novel approach for understanding and engineering superconducting materials and devices

Researcher (PI): Adrian KANTIAN

Host Institution (HI): UPPSALA UNIVERSITET

Call Details: Starting Grant (StG), PE3, ERC-2017-STG

Summary: Correlated electrons are at the forefront of condensed matter theory. Interacting quasi-1D electrons have seen vast progress in analytical and numerical theory, and thus in fundamental understanding and quantitative prediction. Yet, in the 1D limit fluctuations preclude important technological use, particularly of superconductors. In contrast, high-Tc superconductors in 2D/3D are not precluded by fluctuations, but lack a fundamental theory, making prediction and engineering of their properties, a major goal in physics, very difficult. This project aims to combine the advantages of both areas by making major progress in the theory of quasi-1D electrons coupled to an electron bath, in part building on recent breakthroughs (with the PI's extensive involvement) in simulating 1D and 2D electrons with parallelized density matrix renormalization group (pDMRG) numerics. Such theory will fundamentally advance the study of open electron systems, and show how to use 1D materials as elements of new superconducting (SC) devices and materials: 1) It will enable a new state of matter, 1D electrons with true SC order. Fluctuations from the electronic liquid, such as graphene, could also enable nanoscale wires to appear SC at high temperatures. 2) A new approach for the deliberate engineering of a high-Tc superconductor. In 1D, how electrons pair by repulsive interactions is understood and can be predicted. Stabilization by a reservoir, formed by a parallel array of many such 1D systems, offers a superconductor for which all factors setting Tc are known and can be optimized. 3) Many existing superconductors with repulsive electron pairing, all presently not understood, can be cast as 1D electrons coupled to a bath. Developing chain-DMFT theory based on pDMRG will allow these materials' SC properties to be simulated and understood for the first time. 4) The insights gained will be translated to 2D superconductors to study how they could be enhanced by contact with electronic liquids.

Max ERC Funding

1 491 013 €

Duration

Start date: 2018-10-01, End date: 2023-09-30

Project acronym: 1st-principles-discs

Project: A First Principles Approach to Accretion Discs

Researcher (PI): Martin Elias Pessah

Host Institution (HI): KOBENHAVNS UNIVERSITET

Call Details: Starting Grant (StG), PE9, ERC-2012-StG_20111012

Summary: Most celestial bodies, from planets to stars to black holes, gain mass during their lives by means of an accretion disc. Understanding the physical processes that determine the rate at which matter accretes and energy is radiated in these discs is vital for unraveling the formation, evolution, and fate of almost every type of object in the Universe. Despite the fact that magnetic fields have been known to be crucial in accretion discs since the early 1990s, the majority of astrophysical questions that depend on the details of how disc accretion proceeds are still being addressed using the "standard" accretion disc model (developed in the early 1970s), where magnetic fields do not play an explicit role. This has prevented us from fully exploring the astrophysical consequences and observational signatures of realistic accretion disc models, leading to a profound disconnect between observations (usually interpreted with the standard paradigm) and modern accretion disc theory and numerical simulations (where magnetic turbulence is crucial). The goal of this proposal is to use several complementary approaches in order to finally move beyond the standard paradigm. This program has two main objectives: 1) Develop the theoretical framework to incorporate magnetic fields, and the ensuing turbulence, into self-consistent accretion disc models, and investigate their observational implications. 2) Investigate transport and radiative processes in collisionless disc regions, where non-thermal radiation originates, by employing a kinetic particle description of the plasma. In order to achieve these goals, we will use, and build upon, state-of-the-art magnetohydrodynamic and particle-in-cell codes in conjunction with theoretical modeling. This framework will make it possible to address fundamental questions on stellar and planet formation, binary systems with a compact object, and supermassive black hole feedback in a way that has no counterpart within the standard paradigm.

Max ERC Funding

1 793 697 €

Duration

Start date: 2013-02-01, End date: 2018-01-31

Project acronym: 1stProposal

Project: An alternative development of analytic number theory and applications

Researcher (PI): ANDREW Granville

Host Institution (HI): UNIVERSITY COLLEGE LONDON

Call Details: Advanced Grant (AdG), PE1, ERC-2014-ADG

Summary: The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as "ad hoc". In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results, but also many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods, and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions, this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of it, as well as a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs on the sieve also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results and those for multiplicative functions, especially in large value spectrum theory and its applications. We hope to develop these further.
Much of this is joint work with K Soundararajan of Stanford University.

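Not part of the proposal, but an illustration of why mean values of multiplicative functions carry deep analytic content: the statement that the mean value of the Möbius function tends to zero is equivalent to the prime number theorem. A minimal sketch:

```python
def mobius_sieve(n):
    """Möbius function mu(1..n) via a sieve: flip the sign once for each
    prime factor, then zero out anything divisible by the square of a prime."""
    mu = [1] * (n + 1)
    is_prime = [True] * (n + 1)
    for p in range(2, n + 1):
        if is_prime[p]:
            for m in range(p, n + 1, p):
                if m > p:
                    is_prime[m] = False
                mu[m] *= -1
            for m in range(p * p, n + 1, p * p):
                mu[m] = 0
    return mu

# The mean value (1/x) * sum_{n <= x} mu(n) tends to 0 as x grows --
# a fact equivalent to the prime number theorem.
mu = mobius_sieve(100_000)
mean_value = sum(mu[1:]) / 100_000
```

Here the "mean value" is computed by brute force; the proposal's point is to reason about such averages directly, without analytic continuation of the associated Dirichlet series.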
Max ERC Funding

2 011 742 €

Duration

Start date: 2015-08-01, End date: 2020-07-31

Project acronym: 1toStopVax

Project: RNA virus attenuation by altering mutational robustness

Researcher (PI): Marco VIGNUZZI

Host Institution (HI): INSTITUT PASTEUR

Call Details: Proof of Concept (PoC), ERC-2016-PoC

Summary: RNA viruses have extreme mutation frequencies. When an RNA virus replicates, nucleotide mutations are generated, resulting in a population of variants. This genetic diversity creates a cloud of mutations that are potentially beneficial to viral survival, but the majority of mutations are detrimental to the virus. Increasing the mutation rate of an RNA virus reduces viral fitness, because the virus generates more errors, and attenuates it during in vivo infection. Another feature that affects RNA virus fitness is mutational robustness: the ability to buffer the negative effects of mutation.
The attenuation of RNA viruses for vaccine production faces problems of genetic instability and reversion to a pathogenic phenotype. The conventional method of attenuation is mostly empirical and specific to the particular RNA virus species; hence it cannot be universally applied to a variety of virus types. We have developed a non-empirical, rational means of attenuating RNA viruses, targeting mutational robustness as a modifiable trait. We demonstrate that the mutational robustness of RNA viruses can be modified without changing a virus's physical and biological properties for vaccine production; yet the virus is attenuated, as it becomes a victim of its naturally high mutation rate. Specifically, the genomes of RNA viruses are modified so that a larger proportion of mutations become lethal Stop mutations. Our technology places the virus one step away from these Stop mutations (1-to-Stop). We succeeded in attenuating two RNA viruses from very different viral families, confirming the broad applicability of this approach. These viruses were attenuated in vivo, generated high levels of neutralizing antibody, and protected mice from lethal challenge infection.
The proposal now seeks to complete proof-of-concept studies and develop commercialization strategies to scale up this new technology to preclinical testing with industrial partners.


Max ERC Funding

150 000 €

Duration

Start date: 2016-09-01, End date: 2018-02-28

Project acronym2-3-AUT

ProjectSurfaces, 3-manifolds and automorphism groups

Researcher (PI)Nathalie Wahl

Host Institution (HI)KOBENHAVNS UNIVERSITET

Call DetailsStarting Grant (StG), PE1, ERC-2009-StG

SummaryThe scientific goal of the proposal is to answer central questions related to diffeomorphism groups of manifolds of dimension 2 and 3, and to their deformation-invariant analogs, the mapping class groups. While the classification of surfaces has been known for more than a century, their automorphism groups have yet to be fully understood. Even less is known about diffeomorphisms of 3-manifolds despite much interest, and the objects here have only been classified recently, by the breakthrough work of Perelman on the Poincaré and geometrization conjectures. In dimension 2, I will focus on the relationship between mapping class groups and topological conformal field theories, with applications to Hochschild homology. In dimension 3, I propose to compute the stable homology of classifying spaces of diffeomorphism groups and mapping class groups, as well as study the homotopy type of the space of diffeomorphisms. I propose moreover to establish homological stability theorems in the wider context of automorphism groups and more general families of groups. The project combines breakthrough methods from homotopy theory with methods from differential and geometric topology. The research team, which I will lead, will consist of 3 PhD students and 4 postdocs.
