Project: An alternative development of analytic number theory and applications

Researcher (PI): Andrew Granville

Host Institution (HI): UNIVERSITY COLLEGE LONDON

Call Details: Advanced Grant (AdG), PE1, ERC-2014-ADG

Summary: The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results but also proving many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible, having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions, this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of that result. We will also develop a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example, recent advances on the mean values of Dirichlet polynomials. The recent breakthroughs in sieve theory also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results and those for multiplicative functions, especially in large-value spectrum theory and its applications. We hope to develop these further.
Much of this is joint work with K. Soundararajan of Stanford University.
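To make the contrast concrete, a minimal illustration in standard notation (not taken from the proposal itself): for a multiplicative function f with |f(n)| ≤ 1, the classical approach studies the Dirichlet series and its Euler product,
\[
F(s) \;=\; \sum_{n \ge 1} \frac{f(n)}{n^{s}} \;=\; \prod_{p} \Bigl(1 + \frac{f(p)}{p^{s}} + \frac{f(p^{2})}{p^{2s}} + \cdots \Bigr), \qquad \Re(s) > 1,
\]
and relies on continuing F(s) analytically beyond the line Re(s) = 1. The alternative approach works directly with the mean value
\[
\frac{1}{x} \sum_{n \le x} f(n),
\]
whose size is controlled, by theorems of Halász type, by how closely f(p) mimics p^{it} on average over the primes p up to x, with no analytic continuation required.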

Max ERC Funding

2 011 742 €

Duration

Start date: 2015-08-01, End date: 2020-07-31

Project acronym: 2SEXES_1GENOME

Project: Sex-specific genetic effects on fitness and human disease

Researcher (PI): Edward Hugh Morrow

Host Institution (HI): THE UNIVERSITY OF SUSSEX

Call Details: Starting Grant (StG), LS8, ERC-2011-StG_20101109

Summary: Darwin’s theory of natural selection rests on the principle that fitness variation in natural populations has a heritable component, on which selection acts, thereby leading to evolutionary change. A fundamental and so far unresolved question for the field of evolutionary biology is to identify the genetic loci responsible for this fitness variation, thereby coming closer to an understanding of how variation is maintained in the face of continual selection. One important complicating factor in the search for fitness-related genes, however, is the existence of separate sexes: theoretical expectations and empirical data both suggest that sexually antagonistic genes are common. The phrase “two sexes, one genome” nicely sums up the problem; selection may favour alleles in one sex even if they have detrimental effects on the fitness of the opposite sex, since it is their net effect across both sexes that determines the likelihood that alleles persist in a population. This theoretical framework raises an interesting and so far entirely unexplored issue: in one sex the functional performance of some alleles is predicted to be compromised, and this effect may account for some common human diseases and conditions which show genotype-sex interactions. I propose to explore the genetic basis of sex-specific fitness in a model organism under both laboratory and natural conditions, and to test whether genes identified as having sexually antagonistic effects can help explain the incidence of human diseases that display sexual dimorphism in prevalence, age of onset or severity. This multidisciplinary project directly addresses fundamental unresolved questions in evolutionary biology, namely the genetic basis and maintenance of fitness variation and the evolution of sexual dimorphism, and aims to provide novel insights into the genetic basis of some common human diseases.

Summary: Despite their amazing success, we believe that computer vision algorithms have only scratched the surface of what can be done in terms of modeling and understanding our world from images. We believe that novel image analysis techniques will be a major enabler and driving force behind next-generation technologies, enhancing everyday life and opening up radically new possibilities. And we believe that the key to achieving this is to develop algorithms for reconstructing and analyzing the 3D structure of our world.
In this project, we will focus on three lines of research:
A) We will develop algorithms for 3D reconstruction from standard color cameras and from RGB-D cameras. In particular, we will promote real-time-capable direct and dense methods. In contrast to the classical two-stage approach of sparse feature-point-based motion estimation and subsequent dense reconstruction, these methods optimally exploit all color information to jointly estimate dense geometry and camera motion (a minimal sketch of such a photometric cost follows this summary).
B) We will develop algorithms for 3D shape analysis, including rigid and non-rigid matching, decomposition and interpretation of 3D shapes. We will focus on algorithms which are optimal or near-optimal. One of the major computational challenges lies in generalizing existing 2D shape analysis techniques to shapes in 3D and 4D (temporal evolutions of 3D shape).
C) We will develop shape priors for 3D reconstruction. These can be learned from sample shapes or acquired during the reconstruction process. For example, when reconstructing a larger office, algorithms may exploit the geometric self-similarity of the scene, storing a single chair model that is referenced by its multiple instances rather than storing each instance separately.
Advancing the state of the art in geometric reconstruction and geometric analysis will have a profound impact well beyond computer vision. We strongly believe that we have the necessary competence to pursue this project. Preliminary results have been well received by the community.
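As a concrete illustration of the direct, dense formulation promoted in research line A) above, the following minimal sketch (hypothetical variable names and a toy nearest-neighbour sampler; not the project's code) evaluates the photometric cost of a candidate camera motion by warping every reference pixel, together with its depth, into a second view and summing the intensity differences; feature-based pipelines would instead use only a sparse set of matched keypoints.

import numpy as np

def photometric_cost(img_ref, img_tgt, depth_ref, K, R, t):
    """Sum of squared intensity differences for a candidate relative pose (R, t).
    img_ref, img_tgt: (H, W) float grayscale images; depth_ref: (H, W) depths;
    K: (3, 3) intrinsics; R: (3, 3) rotation; t: (3,) translation (ref -> target)."""
    H, W = img_ref.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u.ravel(), v.ravel(), np.ones(H * W)])   # homogeneous pixel coords
    rays = np.linalg.inv(K) @ pix                            # back-project to viewing rays
    pts = rays * depth_ref.ravel()                           # 3D points in the reference frame
    proj = K @ (R @ pts + t[:, None])                        # project into the target view
    ok = proj[2] > 1e-6
    u2 = np.round(proj[0, ok] / proj[2, ok]).astype(int)     # nearest-neighbour lookup
    v2 = np.round(proj[1, ok] / proj[2, ok]).astype(int)
    inside = (u2 >= 0) & (u2 < W) & (v2 >= 0) & (v2 < H)
    resid = img_tgt[v2[inside], u2[inside]] - img_ref.ravel()[ok][inside]
    return float(np.sum(resid ** 2))                         # dense photometric residual

# toy call with a flat scene and identity motion (cost should be ~0)
img = np.random.default_rng(0).random((48, 64))
K = np.array([[60.0, 0, 32], [0, 60.0, 24], [0, 0, 1]])
print(photometric_cost(img, img, np.full(img.shape, 2.0), K, np.eye(3), np.zeros(3)))

In a full system, a cost of this form would be minimised jointly over the camera motion (and, for monocular input, over the depth map itself), typically in a coarse-to-fine Gauss-Newton scheme.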

Max ERC Funding

2 000 000 €

Duration

Start date: 2015-09-01, End date: 2020-08-31

Project acronym: 3D-JOINT

Project: 3D Bioprinting of JOINT Replacements

Researcher (PI): Johannes Jos Malda

Host Institution (HI): UNIVERSITAIR MEDISCH CENTRUM UTRECHT

Call Details: Consolidator Grant (CoG), LS7, ERC-2014-CoG

Summary: The world faces a significant medical challenge in repairing injured or diseased joints. Joint degeneration and its related pain is a major socio-economic burden that will increase over the next decade and is currently addressed by implanting a metal prosthesis. In the long term, the ideal solution to joint injury is to regenerate the damaged cartilage rather than replace it with synthetic implants. Recent advances in key technologies are now bringing this “holy grail” within reach; regenerative approaches based on cell therapy are already clinically available, albeit only for smaller focal cartilage defects.
One of these key technologies is three-dimensional (3D) bio-printing, which provides highly controlled placement and organization of living constructs through the layer-by-layer deposition of materials and cells. These tissue constructs can be applied as tissue models for research and screening. However, the poor biomechanical properties of these tissue constructs have hampered their application to the regeneration of damaged, degenerated or diseased tissue.
Having established a cartilage-focussed research laboratory in the University Medical Center Utrecht, I have addressed this biomechanical limitation of hydrogels through the use of hydrogel composites. Specifically, I have pioneered a 3D bio-printing technology that combines accurately printed small-diameter thermoplastic filaments with cell-invasive hydrogels to form strong fibre-reinforced constructs. This, in combination with bioreactor technology, is the key to the generation of larger, complex tissue constructs with cartilage-like biomechanical resilience. With 3D-JOINT I will use my in-depth bio-printing and bioreactor knowledge and experience to develop a multi-phasic 3D-printed biological replacement of the joint.

Max ERC Funding

1 998 871 €

Duration

Start date: 2015-07-01, End date: 2020-06-30

Project acronym: 3D-OA-HISTO

Project: Development of 3D Histopathological Grading of Osteoarthritis

Researcher (PI): Simo Jaakko Saarakkala

Host Institution (HI): OULUN YLIOPISTO

Call Details: Starting Grant (StG), LS7, ERC-2013-StG

Summary"Background: Osteoarthritis (OA) is a common musculoskeletal disease occurring worldwide. Despite extensive research, etiology of OA is still poorly understood. Histopathological grading (HPG) of 2D tissue sections is the gold standard reference method for determination of OA stage. However, traditional 2D-HPG is destructive and based only on subjective visual evaluation. These limitations induce bias to clinical in vitro OA diagnostics and basic research that both rely strongly on HPG.
Objectives: 1) To establish and validate the very first 3D-HPG of OA based on cutting-edge nano/micro-CT (Computed Tomography) technologies in vitro; 2) To use the established method to clarify the beginning phases of OA; and 3) To validate 3D-HPG of OA for in vivo use.
Methods: Several hundred human osteochondral samples from patients undergoing total knee arthroplasty will be collected. The samples will be imaged in vitro with nano/micro-CT and clinical high-end extremity CT devices using specific contrast agents to quantify tissue constituents and structure in 3D over a large volume. From this information, a novel 3D-HPG will be developed with statistical classification algorithms. Finally, the developed novel 3D-HPG of OA will be applied clinically in vivo.
Significance: This is the very first study to establish 3D-HPG of OA pathology in vitro and in vivo. Furthermore, the developed technique will greatly improve the understanding of the early phases of OA. Ultimately, the study will contribute to improving OA patients’ quality of life by slowing disease progression and to providing powerful tools for developing new OA therapies."
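The Methods above specify only "statistical classification algorithms"; as a generic, purely illustrative sketch (synthetic data and hypothetical feature and grade counts, not the project's actual pipeline), the final grading step could map quantitative 3D image features per sample to a severity grade roughly as follows.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))        # stand-in for per-sample CT-derived features
y = rng.integers(0, 4, size=300)      # stand-in for four histopathological grades

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated grading accuracy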

"Background: Osteoarthritis (OA) is a common musculoskeletal disease occurring worldwide. Despite extensive research, etiology of OA is still poorly understood. Histopathological grading (HPG) of 2D tissue sections is the gold standard reference method for determination of OA stage. However, traditional 2D-HPG is destructive and based only on subjective visual evaluation. These limitations induce bias to clinical in vitro OA diagnostics and basic research that both rely strongly on HPG.
Objectives: 1) To establish and validate the very first 3D-HPG of OA based on cutting-edge nano/micro-CT (Computed Tomography) technologies in vitro; 2) To use the established method to clarify the beginning phases of OA; and 3) To validate 3D-HPG of OA for in vivo use.
Methods: Several hundreds of human osteochondral samples from patients undergoing total knee arthroplasty will be collected. The samples will be imaged in vitro with nano/micro-CT and clinical high-end extremity CT devices using specific contrast-agents to quantify tissue constituents and structure in 3D in large volume. From this information, a novel 3D-HPG is developed with statistical classification algorithms. Finally, the developed novel 3D-HPG of OA will be applied clinically in vivo.
Significance: This is the very first study to establish 3D-HPG of OA pathology in vitro and in vivo. Furthermore, the developed technique hugely improves the understanding of the beginning phases of OA. Ultimately, the study will contribute for improving OA patients’ quality of life by slowing the disease progression, and for providing powerful tools to develop new OA therapies."

Max ERC Funding

1 500 000 €

Duration

Start date: 2014-02-01, End date: 2019-01-31

Project acronym: 5HTCircuits

Project: Modulation of cortical circuits and predictive neural coding by serotonin

Summary: Serotonin (5-HT) is a central neuromodulator and a major target of therapeutic psychoactive drugs, but relatively little is known about how it modulates information processing in neural circuits. The theory of predictive coding postulates that the brain combines raw bottom-up sensory information with top-down information from internal models to make perceptual inferences about the world. We hypothesize, based on preliminary data and prior literature, that a role of 5-HT in this process is to report prediction errors and promote the suppression and weakening of erroneous internal models. We propose that it does this by inhibiting top-down relative to bottom-up cortical information flow. To test this hypothesis, we propose a set of experiments in mice performing olfactory perceptual tasks. Our specific aims are: (1) We will test whether 5-HT neurons encode sensory prediction errors. (2) We will test their causal role in using predictive cues to guide perceptual decisions. (3) We will characterize how 5-HT influences the encoding of sensory information by neuronal populations in the olfactory cortex and identify the underlying circuitry. (4) Finally, we will map the effects of 5-HT across the whole brain and use this information to target further causal manipulations to specific 5-HT projections. We will accomplish these aims using state-of-the-art optogenetic, electrophysiological and imaging techniques (including 9.4T small-animal functional magnetic resonance imaging) as well as psychophysical tasks amenable to quantitative analysis and computational theory. Together, these experiments will tackle multiple facets of an important general computational question, bringing to bear an array of cutting-edge technologies to address with unprecedented mechanistic detail how 5-HT impacts neural coding and perceptual decision-making.

Max ERC Funding

2 486 074 €

Duration

Start date: 2016-01-01, End date: 2020-12-31

Project acronym: A-DIET

Project: Metabolomics-based biomarkers of dietary intake - new tools for nutrition research

Researcher (PI): Lorraine Brennan

Host Institution (HI): UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN

Call Details: Consolidator Grant (CoG), LS7, ERC-2014-CoG

Summary: In today's advanced technological world, we can track the exact movement of individuals, analyse their genetic makeup and predict predisposition to certain diseases. However, we are unable to accurately assess an individual’s dietary intake. This is without a doubt one of the main stumbling blocks in assessing the link between diet and disease/health. The present proposal (A-DIET) will address this issue with the overarching objective of developing novel strategies for the assessment of dietary intake.
Using approaches to (1) identify biomarkers of specific foods, (2) classify people into dietary patterns (nutritypes) and (3) develop a tool for the integration of dietary and biomarker data, A-DIET has the potential to dramatically enhance our ability to accurately assess dietary intake. The ultimate output from A-DIET will be a dietary assessment tool which can be used to obtain an accurate assessment of dietary intake by combining dietary and biomarker data, which in turn will allow investigations into relationships between diet, health and disease. New biomarkers of specific foods will be identified and validated using intervention studies and metabolomic analyses. Methods will be developed to classify individuals into dietary patterns based on biomarker/metabolomic profiles, thus demonstrating the novel concept of nutritypes. Strategies for the integration of dietary and biomarker data will be developed and translated into a tool that will be made available to the wider scientific community.
Advances made in A-DIET will enable nutrition epidemiologists to properly examine the relationship between diet and disease and to develop clear public health messages with regard to diet and health. Additionally, results from A-DIET will allow researchers to accurately assess people's diet and implement health promotion strategies, and will enable dieticians in a clinical environment to assess compliance with therapeutic diets, such as adherence to a high-fibre diet or a gluten-free diet.
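A minimal sketch of the "nutritype" classification idea described above (synthetic data and an arbitrary choice of clustering algorithm and cluster count, purely for illustration): standardise each person's metabolomic/biomarker profile and group individuals into data-driven dietary patterns.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
profiles = rng.normal(size=(500, 40))            # stand-in: 500 people x 40 metabolite levels

z = StandardScaler().fit_transform(profiles)     # put metabolites on a common scale
nutritypes = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(z)
print(np.bincount(nutritypes))                   # number of people assigned to each pattern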

Summary"I propose an ambitious, yet feasible 5-year research project that will fill an important gap in global health. Specifically, I will develop and validate novel approaches for anthelmintic drug discovery and development. My proposal pursues the following five research questions: (i) Is a chip calorimeter suitable for high-throughput screening in anthelmintic drug discovery? (ii) Is combination chemotherapy safe and more efficacious than monotherapy against strongyloidiasis and trichuriasis? (iii) What are the key pharmacokinetic parameters of praziquantel in preschool-aged children and school-aged children infected with Schistosoma mansoni and S. haematobium using a novel and validated technology based on dried blood spotting? (iv) What are the metabolic consequences and clearance of praziquantel treatment in S. mansoni-infected mice and S. mansoni- and S. haematobium-infected children? (v) Which is the ideal compartment to study pharmacokinetic parameters for intestinal nematode infections and does age, nutrition, co-infection and infection intensity influence the efficacy of anthelmintic drugs?
My proposed research is of considerable public health relevance since it will ultimately result in improved treatments for soil-transmitted helminthiasis and pediatric schistosomiasis. Additionally, by the end of this project, I will have generated comprehensive information on the drug disposition of anthelmintics. A comprehensive database of metabolite profiles following praziquantel treatment will be available. Finally, the proof-of-concept of chip calorimetry in anthelmintic drug discovery will have been established and broadly validated."

"I propose an ambitious, yet feasible 5-year research project that will fill an important gap in global health. Specifically, I will develop and validate novel approaches for anthelmintic drug discovery and development. My proposal pursues the following five research questions: (i) Is a chip calorimeter suitable for high-throughput screening in anthelmintic drug discovery? (ii) Is combination chemotherapy safe and more efficacious than monotherapy against strongyloidiasis and trichuriasis? (iii) What are the key pharmacokinetic parameters of praziquantel in preschool-aged children and school-aged children infected with Schistosoma mansoni and S. haematobium using a novel and validated technology based on dried blood spotting? (iv) What are the metabolic consequences and clearance of praziquantel treatment in S. mansoni-infected mice and S. mansoni- and S. haematobium-infected children? (v) Which is the ideal compartment to study pharmacokinetic parameters for intestinal nematode infections and does age, nutrition, co-infection and infection intensity influence the efficacy of anthelmintic drugs?
My proposed research is of considerable public health relevance since it will ultimately result in improved treatments for soil-transmitted helminthiasis and pediatric schistosomiasis. Additionally, at the end of this project, I have generated comprehensive information on drug disposition of anthelmintics. A comprehensive database of metabolite profiles following praziquantel treatment will be available. Finally, the proof-of-concept of chip calorimetry in anthelmintic drug discovery has been established and broadly validated."

Summary: Aerosols (i.e. tiny particles suspended in the air) are regularly transported in huge amounts over long distances, impacting air quality, health, weather and climate thousands of kilometers downwind of the source. Aerosols affect the atmospheric radiation budget through scattering and absorption of solar radiation and through their role as cloud/ice nuclei.
In particular, light absorption by aerosol particles such as mineral dust and black carbon (BC; thought to be the second-strongest contributor to current global warming after CO2) is of fundamental importance from a climate perspective, because the presence of absorbing particles (1) contributes to solar radiative forcing, (2) heats absorbing aerosol layers, (3) can evaporate clouds and (4) can change atmospheric dynamics.
Considering this prominent role of aerosols, vertically resolved in-situ data on absorbing aerosols are surprisingly scarce and aerosol-dynamic interactions are poorly understood in general. This is, as recognized in the last IPCC report, a serious barrier to taking the accuracy of climate models and predictions to the next level. To overcome this barrier, I propose to investigate the aging, lifetime and dynamics of absorbing aerosol layers with a holistic end-to-end approach including laboratory studies, airborne field experiments and numerical model simulations.
Building on the internationally recognized results of my aerosol research group and my long-term experience with airborne aerosol measurements, the time seems ripe to systematically bridge the gap between in-situ measurements of aerosol microphysical and optical properties and the assessment of dynamical interactions of absorbing particles with aerosol layer lifetime through model simulations.
The outcomes of this project will provide fundamental new understanding of absorbing aerosol layers in the climate system and important information for addressing the benefits of BC emission controls for mitigating climate change.

Summary"The A2C2 project treats two major challenges in climate and atmospheric research: the time dependence of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computer cost of model simulations, and mathematical assumptions that are hardly verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome these three difficulties, linking the theory of dynamical systems with statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three work packages (WPs). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation over time. The key methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This will be done with idealized models and full-scale climate models in which the forcings (anthropogenic and natural) are known.
A2C2 will create an open-source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow-analogue methodology allows a systematic and quasi-real-time analysis of extreme events, which is currently out of reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty."
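A minimal sketch of a flow-analogue search of the kind the WP2 toolkit described above would provide (array names, domain size and the root-mean-square distance are illustrative assumptions, not the project's implementation): for a target circulation field, return the dates and distance scores of its closest counterparts in a historical archive.

import numpy as np

def find_analogues(target, archive, dates, k=20):
    """target: (ny, nx) field (e.g. daily sea-level pressure over a domain);
    archive: (n_days, ny, nx) historical fields; dates: n_days date labels."""
    diff = archive.reshape(len(archive), -1) - target.ravel()
    dist = np.sqrt(np.mean(diff ** 2, axis=1))   # RMS distance to every archived day
    best = np.argsort(dist)[:k]                  # indices of the k closest analogues
    return [(dates[i], float(dist[i])) for i in best]

# toy usage with synthetic fields standing in for a reanalysis archive
rng = np.random.default_rng(2)
archive = rng.normal(size=(3650, 20, 30))        # roughly ten years of daily fields
dates = np.arange(3650)                          # placeholder date labels
print(find_analogues(rng.normal(size=(20, 30)), archive, dates, k=5))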

"The A2C2 project treats two major challenges in climate and atmospheric research: the time dependence of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computer cost of model simulations, and mathematical assumptions that are hardly verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of an attractor deformation in time. The important methodological step is to detect trends or persisting outliers in the dates and scores of analogues when the system yields time-varying forcings. This is done from idealized models and full size climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding long-term climate changes. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty."