Summary

Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods, and the inexpensive but not systematically refinable Kohn-Sham method of density-functional theory (DFT). The accuracy of Kohn-Sham calculations is determined by the quality of the exchange-correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange-correlation functional is unknown -- instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab initio adiabatic-connection curves, studying the exchange-correlation functional for a fixed density as the electronic interactions are switched on from zero to full strength. Pilot calculations have indicated the feasibility of this approach in simple cases -- here, a comprehensive set of adiabatic-connection curves will be generated and used for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn-Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available -- for example, for understanding and modelling the behaviour of the exchange-correlation functional in electromagnetic fields.
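The adiabatic connection referred to above expresses the exchange-correlation energy as an integral over the interaction strength, E_xc = integral from 0 to 1 of W_lambda d(lambda), where W_lambda is the adiabatic-connection integrand at fixed density. A minimal numerical sketch of "modelling a curve and integrating it" follows; the functional form and every number (w0, w1, c) are purely illustrative assumptions, not values from the proposal or any real system:

```python
import numpy as np

def w_lambda(lam, w0=-0.31, w1=-0.42, c=1.8):
    """Toy adiabatic-connection integrand W_lambda at fixed density.

    Interpolates from the exchange-only limit W_0 = w0 at lam = 0
    toward w1 as the electron-electron interaction is switched on.
    The rational form and all parameter values are illustrative only.
    """
    return w0 + (w1 - w0) * c * lam / (1.0 + c * lam)

# E_xc is the area under the adiabatic-connection curve:
#   E_xc = integral_0^1 W_lambda d(lambda)
lam = np.linspace(0.0, 1.0, 1001)
w = w_lambda(lam)
e_xc = np.sum(0.5 * (w[1:] + w[:-1]) * np.diff(lam))  # trapezoid rule

# For a monotonically decreasing curve, E_xc must lie between the
# two endpoint values of the integrand.
assert w_lambda(1.0) < e_xc < w_lambda(0.0)
```

Fitting such a parametrized curve to accurately computed points, and then integrating it, is the kind of functional construction the summary describes; here the "curve" is just a toy interpolant.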

Max ERC Funding

2 017 932 €

Duration

Start date: 2011-03-01, End date: 2016-02-29

Project acronym

ANISOTROPIC UNIVERSE

Project

The anisotropic universe -- a reality or fluke?

Researcher (PI)

Hans Kristian Kamfjord Eriksen

Host Institution (HI)

UNIVERSITETET I OSLO

Call Details

Starting Grant (StG), PE9, ERC-2010-StG_20091028

Summary

During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few "blemishes" have turned up, apparently not consistent with the standard model: while the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies of nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints at unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of them can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of far more powerful experiments comes online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys and, in the future, Euclid, a proposed galaxy-survey satellite also funded by ESA. By combining the world's best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims merely the results of systematic errors or statistical flukes? If the claims hold up against this tide of new, high-quality data, then cosmology as a whole may need to be rewritten.

Summary

The detection of primordial gravity waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and we are today able to measure the microwave sky with better-than-one-in-a-million precision. However, the latest ultra-sensitive experiments, such as BICEP2 and Planck, make clear that instrumental sensitivity alone will not be sufficient for a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is Gibbs sampling, which I have already applied with great success to basic CMB component separation. The new and critical step is to apply this method to raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science, in the form of robust cosmological parameters and astrophysical component maps, will represent the state of the art in observational cosmology for years to come.
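Gibbs sampling, the engine mentioned above, works by alternately drawing each parameter from its conditional distribution given the current values of all the others. A minimal sketch on a toy version of the separation problem (one constant "signal" offset plus one foreground-template amplitude, Gaussian noise, flat priors); the model, names, and numbers are illustrative and not the proposal's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a constant signal s_true, plus a known foreground
# template t scaled by a_true, plus white noise of std sigma.
n, sigma = 2000, 0.5
s_true, a_true = 1.0, 2.0
t = rng.normal(size=n)                      # known foreground template
d = s_true + a_true * t + sigma * rng.normal(size=n)

# Gibbs sampling: with flat priors, both conditionals are Gaussian.
s, a = 0.0, 0.0
samples = []
for it in range(3000):
    # s | a, d  ~  N( mean(d - a t),  sigma^2 / n )
    s = rng.normal((d - a * t).mean(), sigma / np.sqrt(n))
    # a | s, d  ~  N( sum(t (d - s)) / sum(t^2),  sigma^2 / sum(t^2) )
    tt = (t * t).sum()
    a = rng.normal((t * (d - s)).sum() / tt, sigma / np.sqrt(tt))
    if it >= 500:                           # discard burn-in
        samples.append((s, a))

s_post, a_post = np.mean(samples, axis=0)   # posterior means
assert abs(s_post - s_true) < 0.1
assert abs(a_post - a_true) < 0.1
```

The real application replaces these two scalars with sky maps, foreground spectral parameters, and instrument parameters, but the alternating conditional draws are the same idea.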

Max ERC Funding

1 999 205 €

Duration

Start date: 2018-04-01, End date: 2023-03-31

Project acronym

BIVAQUM

Project

Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry

Researcher (PI)

Simen Kvaal

Host Institution (HI)

UNIVERSITETET I OSLO

Call Details

Starting Grant (StG), PE4, ERC-2014-STG

Summary

The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have a transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
The first objective is to establish the mathematical foundation of the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled-cluster (TCC) method can be neatly formulated with the BIVPs, and TCC both scales polynomially with the number of electrons and is size-consistent. No "variational" method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new ab initio methods and to understand existing ones. The second objective is thus a systematic multireference coupled-cluster (MRCC) theory based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is the implementation of a new coupled-cluster-type method in which the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree-Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics, in addition to quantum chemistry, increases the feasibility of the project.
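The bivariational functional alluded to above can be stated compactly in its standard textbook form (this sketch is not quoted from the proposal): the bra and ket are treated as independent variables,

```latex
\[
  \mathcal{E}[\Psi',\Psi]
  \;=\;
  \frac{\langle \Psi' | H | \Psi \rangle}{\langle \Psi' | \Psi \rangle},
\]
and requiring stationarity under independent variations of
\(\Psi'\) and \(\Psi\) yields
\[
  H\,|\Psi\rangle = E\,|\Psi\rangle,
  \qquad
  H^{\dagger}\,|\Psi'\rangle = \bar{E}\,|\Psi'\rangle,
\]
```

so the Schrödinger equation and its adjoint are both recovered even when H is not Hermitian, which is what makes the principle applicable to similarity-transformed Hamiltonians such as the coupled-cluster one.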

Max ERC Funding

1 499 572 €

Duration

Start date: 2015-04-01, End date: 2020-03-31

Project acronym

CHROMPHYS

Project

Physics of the Solar Chromosphere

Researcher (PI)

Mats Per-Olof Carlsson

Host Institution (HI)

UNIVERSITETET I OSLO

Call Details

Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209

Summary

CHROMPHYS aims at a breakthrough in our understanding of the solar chromosphere by combining the development of sophisticated radiation-magnetohydrodynamic simulations with observations from the upcoming NASA SMEX mission Interface Region Imaging Spectrograph (IRIS).
The enigmatic chromosphere is the transition between the solar surface and the eruptive outer solar atmosphere. The chromosphere harbours and constrains the mass and energy loading processes that define the heating of the corona, the acceleration and the composition of the solar wind, and the energetics and triggering of solar outbursts (filament eruptions, flares, coronal mass ejections) that govern near-Earth space weather and affect mankind's technological environment.
CHROMPHYS targets the following fundamental physics questions about the chromospheric role in the mass and energy loading of the corona:
- Which types of non-thermal energy dominate in the chromosphere and beyond?
- How does the chromosphere regulate mass and energy supply to the corona and the solar wind?
- How do magnetic flux and matter rise through the chromosphere?
- How does the chromosphere affect the free magnetic energy loading that leads to solar eruptions?
CHROMPHYS proposes to answer these questions by producing a new, physics-based vista of the chromosphere through a three-fold effort:
- develop the techniques of high-resolution numerical MHD physics to the level needed to realistically predict and analyse small-scale chromospheric structure and dynamics,
- optimise and calibrate diverse observational diagnostics by synthesizing these in detail from the simulations, and
- obtain and analyse data from IRIS using these diagnostics complemented by data from other space missions and the best solar telescopes on the ground.

Max ERC Funding

2 487 600 €

Duration

Start date: 2012-01-01, End date: 2016-12-31

Project acronym

Cosmoglobe

Project

Cosmoglobe -- mapping the universe from the Milky Way to the Big Bang

Researcher (PI)

Ingunn Kathrine WEHUS

Host Institution (HI)

UNIVERSITETET I OSLO

Call Details

Consolidator Grant (CoG), PE9, ERC-2018-COG

Summary

In the aftermath of the high-precision Planck and BICEP2 experiments, cosmology has undergone a critical transition. Before 2014, most breakthroughs came as direct results of improved detector technology and increased noise sensitivity. After 2014, the main source of uncertainty will be due to astrophysical foregrounds, typically in the form of dust or synchrotron emission from the Milky Way. Indeed, this holds as true for the study of reionization and the cosmic dawn as it does for the hunt for inflationary gravitational waves. To break through this obscuring veil, it is of utmost importance to optimally exploit every piece of available information, merging the world's best observational data with the world's most advanced theoretical models. A first step toward this ultimate goal was recently published as the Planck 2015 Astrophysical Baseline Model, an effort led and conducted by myself.
Here I propose to build Cosmoglobe, a comprehensive model of the radio, microwave and sub-mm sky, covering 100 MHz to 10 THz in both intensity and polarization, extending existing models by three orders of magnitude in frequency and a factor of five in angular resolution. I will leverage a recent algorithmic breakthrough in multi-resolution component separation to jointly analyze some of the world's best data sets, including C-BASS, COMAP, PASIPHAE, Planck, SPIDER, WMAP and many more. This will result in the best cosmological (CMB, SZ, CIB etc.) and astrophysical (thermal and spinning dust, synchrotron and free-free emission etc.) component maps published to date. I will then use this model to derive the world's strongest limits on, and potentially detect, inflationary gravity waves using SPIDER observations; forecast, optimize and analyze observations from the leading next-generation CMB experiments, including LiteBIRD and S4; and derive the first 3D large-scale structure maps from CO intensity mapping from COMAP, potentially opening up a new window on the cosmic dawn.

Max ERC Funding

1 999 382 €

Duration

Start date: 2019-06-01, End date: 2024-05-31

Project acronym

EvoConBiO

Project

Uncovering and engineering the principles governing evolution and cellular control of bioenergetic organelles

Researcher (PI)

Iain JOHNSTON

Host Institution (HI)

UNIVERSITETET I BERGEN

Call Details

Starting Grant (StG), LS8, ERC-2018-STG

Summary

Complex life on Earth is powered by bioenergetic organelles -- mitochondria and chloroplasts. Originally independent organisms, these organelles have retained their own genomes (mtDNA and cpDNA), which have been dramatically reduced through evolutionary history. Organelle genomes form dynamic populations within present-day eukaryotic cells, akin to individuals co-evolving in a "cellular ecosystem". The structure of these populations is central to eukaryotic life. However, the processes shaping the content of these genomes through history, and maintaining their integrity in modern organisms, are poorly understood. This challenges our understanding of eukaryotic evolution and our ability to design rational strategies to engineer bioenergetic performance.
EvoConBiO will address these questions using a unique and unprecedented interdisciplinary approach, combining experimental characterisation and manipulation of organelle genomes with mathematical modelling and cutting-edge statistics. This highly novel combination of experiment and theory will drive the field in a new direction, for the first time uncovering the universal principles underlying the evolution and cellular control of mitochondria and chloroplasts. Our groundbreaking recent work on mtDNA suggests a common tension underlying organelle evolution, between genetic robustness (transferring genes to the nucleus) and the control and maintenance of organelles (retaining genes in organelles). EvoConBiO will reveal the pathways underlying organelle evolution, why organisms adapt to different points on these pathways, and how they resolve this underlying tension. In addition to these "blue sky" scientific insights into a process of central evolutionary importance, we will harness our findings to "learn from evolution" in high-risk, high-reward development of new experimental strategies to engineer chloroplast performance in plants and algae of importance in EU agriculture, biofuel production, and bioengineering.

Max ERC Funding

1 417 862 €

Duration

Start date: 2019-07-01, End date: 2024-06-30

Project acronym

EVOMESODERM

Project

The evolution of mesoderm and its differentiation into cell types and organ systems

Researcher (PI)

Andreas Helmut Hejnol

Host Institution (HI)

UNIVERSITETET I BERGEN

Call Details

Consolidator Grant (CoG), LS8, ERC-2014-CoG

Summary

Mesoderm, the embryonic germ layer between ectoderm and endoderm, gives rise to major organs within the circulatory and excretory systems and to stabilizing tissues (muscles, bones, connective tissue). Although mesoderm is a key innovation in evolutionary history, its origin and further diversification into the different organs and cell types of a broad range of animals has not been elucidated. Our knowledge of mesoderm development is mainly based on work performed in prominent model systems, including vertebrates (fish, frog and mouse) and invertebrates that are distantly related and considered to be highly derived (Drosophila and C. elegans). The project proposed herein aims to study mesoderm development in a variety of highly informative animal taxa and to trace its differentiation into cell types and organs, with the ultimate aim of reconstructing the history of mesoderm during animal evolution. Our approach combines advanced bioinformatics, live imaging and molecular methods, and will be carried out in nine representative species belonging to under-investigated animal groups. We will describe the morphological and molecular development of mesoderm in these species, and the differentiation of two important mesodermal cell types: nephridia and blood. Using this information we will be able to infer the embryology and mesodermal cell-type composition of ancestors at six important nodes in the animal tree of life. We will also be able to determine when shifts in mesoderm development occurred and how these shifts remodeled animal body plans. Further, our implementation of advanced methods in under-studied species will provide new model systems and a more comprehensive framework for further studies in evolutionary developmental biology as well as in other research fields.

Summary
Understanding rates of migration and resilience to climate change is important both for explaining the distribution of individual species and for anticipating how ecosystems may respond to climate change. Two questions about the response of NW European biota to past climate changes are vigorously debated: 1) glacial survival vs. tabula rasa, and 2) Reid's paradox of rapid plant migration through seed dispersal vs. survival in cryptic refugia just south or east of the ice sheet. These questions are related, as survival in any northern refugia would imply local dispersal rather than the rapid dispersal rates required from southern refugia. While we have learned much about dispersal routes from phylogeography and about glacial refugia from macrofossils, pollen and, more recently, ancient DNA (aDNA), we have never been able to trace plant migration routes back in time. Our lab is at a step change in answering these questions: we now have a full genome reference library for the entire flora of Norway and adjacent regions (>2000 species), which will allow us to develop genomic markers that identify not only species but also genetic variation within species in ancient sediment samples. In addition, we have >20 sediment cores already analysed for vascular plant aDNA using metabarcoding, and a further 20 in the pipeline. Based on these and 12 new cores, we will select samples containing key species representing different bioclimatic zones (boreal trees, dwarf shrubs, arctic herbs) and re-analyse them for within-species genetic variation. This will be complemented by analyses of the contemporary phylogeography of the same species, allowing us to identify refugial areas and trace the migration routes of different components of the ecosystems back in time. The results of this study will open a new era in studies of species' abilities to respond to climate change (palaeo-phytogeography) and enable us to model the effects of current global warming more accurately than before.

Max ERC Funding

2 189 776 €

Duration

Start date: 2019-10-01, End date: 2024-09-30

Project acronym
macroevolution.abc

Project
Abiota, Biota, Constraints in Macroevolutionary Processes

Researcher (PI)
Lee Hsiang Liow

Host Institution (HI)
UNIVERSITETET I OSLO

Call Details
Consolidator Grant (CoG), LS8, ERC-2016-COG

Summary
To what degree do microevolutionary processes operating on a generational time scale matter for macroevolutionary patterns recorded on time scales of millions of years in the fossil record? To answer this fundamental question in evolutionary biology, we need a model system in which we can overcome the conceptual and empirical boundaries imposed by disparate timescales. macroevolution.abc will develop bryozoans as the Drosophila of macroevolution, integrating molecular, fossil, phenotypic, ecological and environmental data to shed light on the currently inaccessible “Dark Time Scale” (thousands to tens of thousands of years), spanning the chasm between microevolution, studied by population geneticists and evolutionary ecologists, and macroevolution, studied by paleontologists and comparative phylogeneticists. Using bryozoans, a little-known but uniquely well-suited study group for evolutionary questions, I will generate, then cross-integrate, (i) empirical time series of intra- and interspecific biotic interactions; (ii) phenotypic data describing variation within genetic individuals and among contemporaneous individuals in both extinct and living populations; (iii) robust estimates of abundance shifts in fossil populations; and (iv) speciation and extinction rate estimates from molecular phylogenies and the fossil record. The new bryozoan model evolutionary system will provide answers to previously intractable questions such as “do ecological interactions crucial for individual survival matter for group diversification patterns observed on geological time scales?” and “why do we have to wait a million years for bursts of phenotypic change?”