Summary

Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups that we may discover. This will be possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes, and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and determine the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene, we will furthermore shed light on the early history and origins of Neandertals and Denisovans.

Max ERC Funding

2 350 000 €

Duration

Start date: 2016-11-01, End date: 2021-10-31

Project acronym

14Constraint

Project

Radiocarbon constraints for models of C cycling in terrestrial ecosystems: from process understanding to global benchmarking

Summary

The overall goal of 14Constraint is to enhance the availability and use of radiocarbon data as constraints for process-based understanding of the age distribution of carbon in, and respired by, soils and ecosystems. Carbon enters ecosystems by a single process, photosynthesis. It returns by a range of processes that depend on plant allocation and turnover, the efficiency and rate of litter decomposition, and the mechanisms stabilizing C in soils. Thus the age distribution of respired CO2 and the age of C residing in plants, litter and soils are diagnostic properties of ecosystems that provide key constraints for testing carbon cycle models. Radiocarbon, especially the transit of ‘bomb’ 14C created in the 1960s, is a powerful tool for tracing C exchange on decadal to centennial timescales. 14Constraint will assemble a global database of existing radiocarbon data (WP1) and demonstrate how these data can constrain and test ecosystem carbon cycle models. WP2 will fill data gaps and add new data from sites in key biomes that have ancillary data sufficient to construct belowground C and 14C budgets. These detailed investigations will focus on the role of time lags caused by necromass and fine roots, as well as the dynamics of deep soil C. Spatial extrapolation beyond the WP2 sites will require sampling along global gradients designed to explore the relative roles of mineralogy, vegetation and climate on the age of C in, and respired from, soil (WP3). Products of 14Constraint will include the first publicly available global synthesis of terrestrial 14C data, to which the project will add over 5000 new measurements. This project is urgently needed before atmospheric 14C levels decline below 1950 levels, as expected in the next decade.
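As a minimal illustration of the kind of constraint radiocarbon provides, the conventional radiocarbon age follows from a sample's measured fraction modern via the Libby mean life of 8033 years; carbon enriched by 'bomb' 14C has a fraction modern above 1 and hence a nominally negative age, which is what makes the 1960s bomb spike such a sensitive decadal tracer. The sketch below is illustrative only and is not part of the project's described toolchain:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; from the conventional Libby half-life of 5568 yr

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional radiocarbon age (years BP) from fraction modern (F14C).

    F14C = 1 corresponds to the 1950 atmospheric standard (age 0);
    F14C > 1 indicates post-bomb carbon and yields a negative age.
    """
    if fraction_modern <= 0:
        raise ValueError("fraction modern must be positive")
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# One conventional half-life of decay halves the fraction modern:
# radiocarbon_age(0.5) ≈ 5568 years
```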

Max ERC Funding

2 283 747 €

Duration

Start date: 2016-12-01, End date: 2021-11-30

Project acronym

1stProposal

Project

An alternative development of analytic number theory and applications

Researcher (PI)

Andrew Granville

Host Institution (HI)

UNIVERSITY COLLEGE LONDON

Call Details

Advanced Grant (AdG), PE1, ERC-2014-ADG

Summary

The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results, but also many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of that result. We are also developing a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. Also the recent breakthroughs on the sieve suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results, and those for multiplicative functions, especially in large value spectrum theory, and its applications. We hope to develop these further.
Much of this is joint work with K Soundararajan of Stanford University.

Summary

The project aims to create a demo system for a cost-effective, non-invasive device for rapid detection of cystic fibrosis in humans. The detection of human recessive diseases has been dominated by the use of fluorescent biomarkers based on organic dyes, helping researchers to study and analyse gene expression, the cell cycle, and enzymatic activity. Among several proteolytic enzymes, trypsin has attracted much attention, as it is a target in the study of various important human recessive diseases including, for example, cystic fibrosis (CF).
We present herein two-colour encoded silica nanospheres (2nanoSi) for the quantitative ratiometric fluorescence determination of cystic fibrosis in humans. Current detection technologies for cystic fibrosis diagnosis are slow, costly and suffer from false positives. The 2nanoSi proved to be fast (minutes), single-step, and twice as sensitive as state-of-the-art biomarker-based sensors for cystic fibrosis, allowing the quantification of trypsin concentrations over a wide range (25-350 μg/L). Moreover, our approach can be used from the 4th day of life, when the trypsin concentration is already the same as in adults. Furthermore, as trypsin is directly related to the development of cystic fibrosis (CF), different human phenotypes, i.e. normal (160-340 μg/L), CF homozygotic (0-90 μg/L), and CF heterozygotic (91-349 μg/L), can be determined using our 2nanoSi nanospheres. We anticipate the 2nanoSi system to be a starting point for a non-invasive, easy-to-use and cost-effective ratiometric fluorescence biomarker for recessive genetic diseases such as human cystic fibrosis.
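The phenotype ranges quoted above can be expressed as a trivial lookup. Note that the normal (160-340 μg/L) and CF-heterozygotic (91-349 μg/L) ranges overlap, so a single trypsin reading can be consistent with more than one phenotype. The sketch below uses only the numbers from the summary; the function name and structure are illustrative, not part of the 2nanoSi system:

```python
# Trypsin (IRT) ranges in µg/L as quoted in the summary.
PHENOTYPE_RANGES = {
    "normal": (160, 340),
    "CF homozygotic": (0, 90),
    "CF heterozygotic": (91, 349),
}

def consistent_phenotypes(trypsin_ug_per_l: float) -> list[str]:
    """Return every phenotype whose quoted range contains the reading.

    Because the normal and heterozygotic ranges overlap, readings between
    160 and 340 µg/L match both; a single measurement cannot distinguish
    them on its own.
    """
    return [
        phenotype
        for phenotype, (lo, hi) in PHENOTYPE_RANGES.items()
        if lo <= trypsin_ug_per_l <= hi
    ]
```

For example, a reading of 50 µg/L matches only the CF-homozygotic range, whereas 200 µg/L matches both the normal and heterozygotic ranges.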

Summary

For many years, the ubiquitin-26S proteasome degradation pathway was considered the primary route for proteasomal degradation. However, it is now becoming clear that proteins can also be targeted for degradation by a ubiquitin-independent mechanism mediated by the core 20S proteasome itself. Although initially believed to be limited to rare exceptions, degradation by the 20S proteasome is now understood to have a wide range of substrates, many of which are key regulatory proteins. Despite its importance, little is known about the mechanisms that control 20S proteasomal degradation, unlike the extensive knowledge acquired over the years concerning degradation by the 26S proteasome. Our overall aim is to reveal the multiple regulatory levels that coordinate the 20S proteasome degradation route.
To achieve this goal we will carry out a comprehensive research program characterizing three distinct levels of 20S proteasome regulation:
Intra-molecular regulation: revealing the intrinsic molecular switch that activates the latent 20S proteasome.
Inter-molecular regulation: identifying novel proteins that bind the 20S proteasome to regulate its activity and characterizing their mechanism of function.
Cellular regulatory networks: unraveling the cellular cues and multiple pathways that influence 20S proteasome activity using a novel systematic and unbiased screening approach.
Our experimental strategy combines biochemical approaches with native mass spectrometry, cross-linking and fluorescence measurements, complemented by cell biology analyses and high-throughput screening. Such a multidisciplinary approach, integrating in vitro and in vivo findings, will likely provide the much-needed knowledge on the 20S proteasome degradation route. When completed, we anticipate that this work will be part of a new paradigm: no longer perceiving 20S proteasome-mediated degradation as a simple and passive event, but rather as a tightly regulated and coordinated process.

Max ERC Funding

1 500 000 €

Duration

Start date: 2015-04-01, End date: 2020-03-31

Project acronym2D-CHEM

ProjectTwo-Dimensional Chemistry towards New Graphene Derivatives

Researcher (PI)Michal Otyepka

Host Institution (HI)UNIVERZITA PALACKEHO V OLOMOUCI

Call DetailsConsolidator Grant (CoG), PE5, ERC-2015-CoG

Summary

The suite of graphene’s unique properties and applications can be enormously enhanced by its functionalization. As non-covalently functionalized graphenes do not target all of graphene’s properties and may suffer from limited stability, covalent functionalization represents a promising way of controlling graphene’s properties. To date, only a few well-defined graphene derivatives have been introduced. Among them, fluorographene (FG) stands out as a prominent member because of its easy synthesis and high stability. Being a perfluorinated hydrocarbon and the two-dimensional counterpart of perfluoropolyethylene (Teflon®), FG was believed to be similarly unreactive. However, our recent experiments showed that FG is not chemically inert and can be used as a viable precursor for synthesizing graphene derivatives. This surprising behavior indicates that common textbook knowledge cannot blindly be applied to the chemistry of 2D materials. Further, there might be specific rules behind the chemistry of 2D materials, forming a new chemical discipline we tentatively call 2D chemistry. The main aim of the project is to explore, identify and apply the rules of 2D chemistry, starting from FG. Using the knowledge gained of 2D chemistry, we will attempt to control the chemistry of various 2D materials, aiming to prepare stable graphene derivatives with designed properties, e.g., a 1-3 eV band gap, fluorescent properties, sustainable magnetic ordering and dispersibility in polar media. The new graphene derivatives will be applied in sensing, imaging, magnetic delivery and catalysis, and new emerging applications arising from synergistic phenomena are expected. We envisage that new applications will be opened up that benefit from the 2D scaffold and tailored properties of the synthesized derivatives.
The derivatives will be used for the synthesis of 3D hybrid materials by covalently linking the 2D sheets with other organic and inorganic molecules, nanomaterials or biomacromolecules.

Max ERC Funding

1 831 103 €

Duration

Start date: 2016-06-01, End date: 2021-05-31

Project acronym2D-Ink

ProjectInk-Jet printed supercapacitors based on 2D nanomaterials.

Researcher (PI)Valeria Nicolosi

Host Institution (HI)THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN

Call DetailsProof of Concept (PoC), PC1, ERC-2014-PoC

Summary

This proposal will determine the technical-economic viability of scaling up ultra-thin, ink-jet printed films based on liquid-phase exfoliated single atomic layers of a range of nanomaterials. The PI has developed methods to produce, in liquid, nanosheets of a range of layered materials such as graphene, transition metal oxides, etc. These 2D materials have immediate and far-reaching potential in several high-impact technological applications such as microelectronics, composites, and energy harvesting and storage. 2DNanoCaps (ERC ref: 278516) has demonstrated that lab-scale ultra-thin graphene-based supercapacitor electrodes deliver unusually high power and extremely long device lifetime (100% capacitance retention over 5000 charge-discharge cycles at the high scan rate of 10,000 mV/s). This performance is an order of magnitude better than that of similar systems produced with conventional methods, which cause material restacking and aggregation. A subsequent ERC PoC grant (2D-USD, Project Number 620189) is currently focussed on up-scaling thin-film deposition methods based on ultrasonic spray for the production of large-area electrodes for supercapacitor applications. In this proposal we want to explore the new concept of manufacturing conductive, robust, thin, easily assembled electrodes and solid electrolytes to realize highly flexible, all-solid-state supercapacitors by ink-jet printing. This opportunity is particularly relevant to the electronics and portable-device industry and offers the possibility of solving flammability issues while maintaining light weight, flexibility, transparency and portability. In order to do so it will be imperative to develop ink-jet printing methods and techniques. We believe our combination of unique materials and a cost-effective, robust and production-scalable ultra-thin ink-jet printing process will enable us to compete for significant global market opportunities in the energy-storage space.

Summary

Some 50% of human melanoma tumors have activating mutations in the BRAF gene. BRAF inhibitor drugs given either alone or in combination with MEK inhibitors have improved progression-free and overall survival in patients with BRAF mutant metastatic melanoma. However, drug resistance invariably limits the duration of clinical benefit of such treatments and is almost always associated with re-activation of signaling through the MAP kinase pathway in the presence of drug due to secondary mutations in the pathway. This highlights the urgent need to develop strategies to treat melanomas that have developed resistance to BRAF and/or MEK inhibitors.
As part of an ERC advanced grant, my laboratory has shown that BRAF inhibitor withdrawal in melanomas that have developed resistance to BRAF inhibitors leads to a transient growth arrest that is the consequence of temporary hyperactivation of signaling through the MAP kinase pathway, explaining the so called “drug holiday effect”. We have also found that subsequent treatment of such BRAF inhibitor resistant melanomas with Histone DeACetylase inhibitor drugs (HDACi) leads to persistent hyperactivation of MAP kinase signaling, causing both chronic proliferation arrest and cell death, ultimately leading to complete regression of BRAF-inhibitor resistant melanomas in mice.
We propose here to perform a proof of concept study in at least 10 evaluable melanoma patients who, after a proven initial tumor response, have developed resistance to BRAF inhibitors, to validate that subsequent treatment of such patients with an HDACi drug will result in durable responses. Translational studies on tumor biopsies taken before, during and after HDACi treatment will be performed to study the cellular effects of HDACi treatment. Our goal is to provide initial proof of concept in patients for use of this sequential BRAFi-HDACi therapy as the treatment of choice for the roughly 40,000 BRAF mutant melanomas that are diagnosed in the EU annually.

Summary

Despite their amazing success, we believe that computer vision algorithms have only scratched the surface of what can be done in terms of modeling and understanding our world from images. We believe that novel image analysis techniques will be a major enabler and driving force behind next-generation technologies, enhancing everyday life and opening up radically new possibilities. And we believe that the key to achieving this is to develop algorithms for reconstructing and analyzing the 3D structure of our world.
In this project, we will focus on three lines of research:
A) We will develop algorithms for 3D reconstruction from standard color cameras and from RGB-D cameras. In particular, we will promote real-time-capable direct and dense methods. In contrast to the classical two-stage approach of sparse feature-point based motion estimation and subsequent dense reconstruction, these methods optimally exploit all color information to jointly estimate dense geometry and camera motion.
B) We will develop algorithms for 3D shape analysis, including rigid and non-rigid matching, decomposition and interpretation of 3D shapes. We will focus on algorithms which are optimal or near-optimal. One of the major computational challenges lies in generalizing existing 2D shape analysis techniques to shapes in 3D and 4D (temporal evolutions of 3D shape).
C) We will develop shape priors for 3D reconstruction. These can be learned from sample shapes or acquired during the reconstruction process. For example, when reconstructing a larger office, algorithms may exploit the geometric self-similarity of the scene, storing a model of a chair and its multiple instances only once rather than multiple times.
Advancing the state of the art in geometric reconstruction and geometric analysis will have a profound impact well beyond computer vision. We strongly believe that we have the necessary competence to pursue this project. Preliminary results have been well received by the community.
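The "direct" alignment idea in (A) can be illustrated with a deliberately simplified sketch: instead of matching sparse feature points, every pixel's intensity contributes to a photometric error that is minimised over the motion parameters. The toy example below (our own illustration, not the project's method, with all names ours) recovers an integer image translation this way; real direct methods instead optimise a full 6-DoF camera pose jointly with dense depth.

```python
import numpy as np

def photometric_residual(ref, cur, shift):
    """Sum of squared intensity differences over the region where the two
    images overlap after offsetting by an integer shift (dy, dx)."""
    dy, dx = shift
    h, w = ref.shape
    # Crop both images to their common overlap for this shift.
    a = ref[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
    b = cur[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return float(np.sum((a - b) ** 2))

def align_direct(ref, cur, search=3):
    """Exhaustively minimise the dense photometric error over small shifts.
    Every pixel votes, in contrast to sparse feature-point matching."""
    best = min(
        (photometric_residual(ref, cur, (dy, dx)), (dy, dx))
        for dy in range(-search, search + 1)
        for dx in range(-search, search + 1)
    )
    return best[1]

# Toy demo: `cur` is `ref` translated so the true offset is (1, 2).
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
cur = np.roll(ref, shift=(-1, -2), axis=(0, 1))
print(align_direct(ref, cur))  # -> (1, 2)
```

In practice the exhaustive search is replaced by gradient-based optimisation of the photometric error, which is what makes real-time-capable direct methods feasible.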


Max ERC Funding: 2 000 000 €
Duration: Start date 2015-09-01, End date 2020-08-31

Project acronym: 3D-COUNT

Project: 3D-Integrated single photon detector

Researcher (PI): Fabio SCIARRINO

Host Institution (HI): UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA

Call Details: Proof of Concept (PoC), PC1, ERC-2015-PoC

Summary: Photonics, in recognition of its strategic significance and pervasiveness throughout many industrial sectors, has been identified as one of the Key Enabling Technologies for Europe. Photonics in combination with quantum information science has great potential to facilitate, transform and innovate future technologies for the better. The Proof of Concept (PoC) project intends to contribute to this by developing and testing a communication platform prototype comprising single-photon detectors that are efficiently coupled to single-mode fibers using an innovative laser-written device. This enables the integration of single-photon detectors on innovative glass waveguides. These glass integrated photonic circuits offer low scattering losses, flexible waveguide geometry, and high coupling efficiency with optical fibers, making them well suited to on-chip quantum optics implementations.
The device developed and tested in the PoC directly addresses a market need for integrated and efficient on-chip communication systems. Currently available systems have limitations involving high costs, complex production, and inefficient coupling of detectors to optical fibers. The proposed platform will offer 1) a simplified production process, 2) high optical-fiber coupling efficiency, 3) improved performance levels, 4) high cost efficiency, and 5) compactness. Such systems can be applied in a wide range of communication and non-communication applications, such as free-space optical communication, quantum communication, quantum cryptography, DNA sequencing, single-molecule detection and material analysis. Moreover, the future commercialisation of quantum computing is expected to create a vast demand for these communication systems.
In addition to the technology PoC, the project will develop an IPR strategy through patenting actions, determine the market potential, seek market feedback, and plan post-PoC commercialisation paths.
