Summary: "Background: Osteoarthritis (OA) is a common musculoskeletal disease occurring worldwide. Despite extensive research, the etiology of OA is still poorly understood. Histopathological grading (HPG) of 2D tissue sections is the gold-standard reference method for determining the stage of OA. However, traditional 2D-HPG is destructive and based solely on subjective visual evaluation. These limitations introduce bias into clinical in vitro OA diagnostics and into basic research, both of which rely strongly on HPG.
Objectives: 1) To establish and validate the first 3D-HPG of OA, based on cutting-edge nano/micro-CT (computed tomography) technologies, in vitro; 2) To use the established method to clarify the early phases of OA; and 3) To validate the 3D-HPG of OA for in vivo use.
Methods: Several hundred human osteochondral samples will be collected from patients undergoing total knee arthroplasty. The samples will be imaged in vitro with nano/micro-CT and clinical high-end extremity CT devices, using specific contrast agents to quantify tissue constituents and structure in 3D over large volumes. From this information, a novel 3D-HPG will be developed with statistical classification algorithms. Finally, the novel 3D-HPG of OA will be applied clinically in vivo.
Significance: This is the first study to establish 3D-HPG of OA pathology in vitro and in vivo. Furthermore, the developed technique will greatly improve our understanding of the early phases of OA. Ultimately, the study will contribute to improving OA patients’ quality of life by slowing disease progression, and to providing powerful tools for developing new OA therapies."
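The abstract does not specify which statistical classification algorithms will be used to build the 3D-HPG. Purely as an illustrative sketch, a minimal nearest-centroid grader over hypothetical CT-derived features (the feature names and grade labels are invented for the example) could look like this:

```python
# Illustrative sketch only: the proposal does not name its classifiers.
# A minimal nearest-centroid grader mapping hypothetical CT-derived
# features (mean attenuation, cartilage thickness) to an OA grade.
from math import dist

def train_centroids(samples):
    """samples: list of (feature_vector, grade). Returns grade -> centroid."""
    by_grade = {}
    for features, grade in samples:
        by_grade.setdefault(grade, []).append(features)
    return {g: tuple(sum(col) / len(col) for col in zip(*vecs))
            for g, vecs in by_grade.items()}

def predict(centroids, features):
    """Assign the grade whose centroid is closest in feature space."""
    return min(centroids, key=lambda g: dist(centroids[g], features))

# Synthetic example: (attenuation, thickness_mm) -> grade 0 (healthy) / 2 (OA)
training = [((1.0, 2.5), 0), ((1.1, 2.4), 0), ((1.8, 1.2), 2), ((1.9, 1.0), 2)]
model = train_centroids(training)
print(predict(model, (1.85, 1.1)))  # thin, strongly attenuating sample -> 2
```

In practice one would expect cross-validated models trained on features extracted from the contrast-enhanced CT volumes, but the nearest-centroid rule shows the basic shape of mapping quantitative imaging features to a histopathological grade.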


Summary: At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium becomes locally denser and forms dark clouds (also called dense or molecular clouds), whose innermost parts are shielded from the external UV field by dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first building blocks of life. The current view of interstellar chemistry is based on fragmentary studies of the key observable steps of this sequence. The objective of this proposal is to follow the fractionation of the elements between the gas phase and interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material from which planets are formed. The expected outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new grain-surface processes relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be assessed with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
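The gas-grain fractionation described above can be caricatured with a two-reservoir toy model. Real astrochemical codes track hundreds of species and thousands of reactions; the species, rate constants and time step below are invented for illustration:

```python
# Illustrative sketch only: a single species exchanging between the gas
# phase and grain surfaces ("ice"), with hypothetical adsorption and
# desorption rate constants, integrated by explicit Euler steps.

def evolve(n_gas, n_ice, k_ads, k_des, dt, steps):
    """Integrate adsorption/desorption exchange; returns (n_gas, n_ice)."""
    for _ in range(steps):
        flux = k_ads * n_gas - k_des * n_ice   # net gas -> ice transfer rate
        n_gas -= flux * dt
        n_ice += flux * dt
    return n_gas, n_ice

# Dense-cloud-like case: freeze-out dominates (k_ads >> k_des), so most
# material ends up on the grains, approaching k_ads/(k_ads + k_des).
gas, ice = evolve(n_gas=1.0, n_ice=0.0, k_ads=1.0, k_des=0.1, dt=0.01, steps=10000)
print(f"ice fraction ~ {ice / (gas + ice):.3f}")
```

The analytic steady state of this toy is an ice fraction of k_ads/(k_ads + k_des) ≈ 0.909; the chemical models in the proposal solve the same kind of rate equations for full reaction networks coupled to the dynamical evolution of the cloud.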


Max ERC Funding

1 166 231 €

Duration

Start date: 2013-09-01, End date: 2018-08-31

Project acronym: 3DWATERWAVES

Project: Mathematical aspects of three-dimensional water waves with vorticity

Researcher (PI): Erik Torsten Wahlen

Host Institution (HI): LUNDS UNIVERSITET

Call Details: Starting Grant (StG), PE1, ERC-2015-STG

Summary: The goal of this project is to develop a mathematical theory for steady three-dimensional water waves with vorticity. The mathematical model consists of the incompressible Euler equations with a free surface; vorticity is important for modelling the interaction of surface waves with non-uniform currents. In the two-dimensional case, there has been considerable progress on water waves with vorticity in the last decade. This progress has mainly been based on the stream function formulation, in which the problem is recast as a nonlinear elliptic free boundary problem. No analogue of this formulation is available in three dimensions, and the theory has therefore so far been restricted to irrotational flow. In this project we seek to go beyond this restriction using two different approaches. In the first approach, we will adapt methods which have been used to construct three-dimensional ideal flows with vorticity in domains with a fixed boundary (for example Beltrami flows) to the free boundary context. In the second approach, we will develop methods which are new even in the case of a fixed boundary, by performing a detailed study of the structure of the equations close to a given shear flow, using ideas from infinite-dimensional bifurcation theory. This involves handling infinitely many resonances.
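For orientation, the two-dimensional stream-function formulation referred to above can be sketched as follows. This is one common convention, with the fluid occupying the region between a flat bed and the free surface $y = \eta(x)$; signs and normalizations vary between authors:

```latex
\begin{aligned}
  \Delta\psi &= -\gamma(\psi)      && \text{in } \{0 < y < \eta(x)\},\\
  \psi &= 0                        && \text{on } y = \eta(x),\\
  \psi &= -p_0                     && \text{on } y = 0,\\
  |\nabla\psi|^2 + 2g\,\eta &= Q   && \text{on } y = \eta(x),
\end{aligned}
```

where $\psi$ is the stream function, $\gamma$ the vorticity function, $p_0$ the relative mass flux, $g$ gravity, and $Q$ a Bernoulli constant. The absence of a single scalar stream function in three dimensions is precisely what blocks a direct analogue of this semilinear elliptic reformulation.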


Summary: Crucial processes within cells depend on specific non-covalent interactions which mediate the assembly of proteins and other biomolecules. Deriving structural information to understand the function of these complex systems is the primary goal of structural biology.
In this application, the recently developed LILBID method (laser-induced liquid bead ion desorption) will be optimized for the investigation of macromolecular complexes, with a mass accuracy two orders of magnitude better than in first-generation spectrometers.
Controlled disassembly of multiprotein complexes during mass spectrometric analysis, while keeping the 3D structure intact, will allow for the determination of complex stoichiometry and of the connectivity of the constituent proteins. Methods for such controlled disassembly will be developed in two separate units of the proposed LILBID spectrometer, a collision chamber and a laser dissociation chamber, enabling gas-phase dissociation of protein complexes and removal of excess water/buffer molecules. As a third unit, a chamber for ion mobility (IM) measurements will be integrated to determine collision cross sections (CCS). From the CCS, unique information about the spatial arrangement of proteins in complexes or subcomplexes will then be obtainable with LILBID.
The proposed design of the new spectrometer will offer fundamentally new possibilities for the investigation of non-covalent RNA, soluble and membrane protein complexes, as well as broadening the applicability of non-covalent MS towards supercomplexes.
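For context, a CCS is conventionally derived from a measured ion mobility via the Mason-Schamp relation. A sketch with hypothetical input values follows (the proposal gives no instrument parameters; charge state, mobility, gas and masses below are invented):

```python
# Illustrative sketch: collision cross section (CCS) from a measured ion
# mobility K via the Mason-Schamp relation. All input values hypothetical.
from math import sqrt, pi

K_B = 1.380649e-23           # Boltzmann constant, J/K
E_CHARGE = 1.602176634e-19   # elementary charge, C

def ccs_mason_schamp(z, K, T, N, m_ion_kg, m_gas_kg):
    """CCS in m^2 from mobility K (m^2 V^-1 s^-1), gas number density N (m^-3)."""
    mu = m_ion_kg * m_gas_kg / (m_ion_kg + m_gas_kg)  # reduced mass
    return (3 * z * E_CHARGE) / (16 * N) * sqrt(2 * pi / (mu * K_B * T)) / K

# Hypothetical 10 kDa ion drifting in N2 at 298 K, standard number density
amu = 1.66053906660e-27
omega = ccs_mason_schamp(z=1, K=1.0e-4, T=298.0, N=2.5e25,
                         m_ion_kg=10000 * amu, m_gas_kg=28 * amu)
print(f"CCS ~ {omega * 1e20:.0f} A^2")
```

The point of the third IM chamber is exactly this conversion: a drift-time measurement yields K, and the CCS then constrains the spatial arrangement of the subunits.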


Max ERC Funding

1 264 477 €

Duration

Start date: 2014-02-01, End date: 2019-01-31

Project acronym: AAREA

Project: The Archaeology of Agricultural Resilience in Eastern Africa

Researcher (PI): Daryl Stump

Host Institution (HI): UNIVERSITY OF YORK

Call Details: Starting Grant (StG), SH6, ERC-2013-StG

Summary: "The twin concepts of sustainability and conservation that are so pivotal within current debates regarding economic development and biodiversity protection both contain an inherent temporal dimension, since both refer to the need to balance short-term gains with long-term resource maintenance. Proponents of resilience theory and of development based on ‘indigenous knowledge’ have thus argued for the necessity of including archaeological, historical and palaeoenvironmental components within development project design. Indeed, some have argued that archaeology should lead these interdisciplinary projects on the grounds that it provides the necessary time depth and bridges the social and natural sciences. The project proposed here accepts this logic and endorses this renewed contemporary relevance of archaeological research. However, it also needs to be admitted that moving beyond critiques of the misuse of historical data presents significant hurdles. When presenting results outside the discipline, for example, archaeological projects tend to downplay the poor archaeological visibility of certain agricultural practices, and computer models designed to test sustainability struggle to adequately account for local cultural preferences. This field will therefore not progress unless there is a frank appraisal of archaeology’s strengths and weaknesses. This project will provide this assessment by employing a range of established and groundbreaking archaeological and modelling techniques to examine the development of two East African agricultural systems: one at the abandoned site of Engaruka in Tanzania, commonly seen as an example of resource mismanagement and ecological collapse; and the other in the current agricultural landscape of Konso, Ethiopia, described by the UN FAO as one of a select few African “lessons from the past”.
The project thus aims to assess the sustainability of these systems, but will also assess the role archaeology can play in such debates worldwide."


Max ERC Funding

1 196 701 €

Duration

Start date: 2014-02-01, End date: 2018-01-31

Project acronym: ABDESIGN

Project: Computational design of novel protein function in antibodies

Researcher (PI): Sarel-Jacob Fleishman

Host Institution (HI): WEIZMANN INSTITUTE OF SCIENCE

Call Details: Starting Grant (StG), LS1, ERC-2013-StG

Summary: We propose to elucidate the structural design principles of naturally occurring antibody complementarity-determining regions (CDRs) and to computationally design novel antibody functions. Antibodies represent the most versatile known system for molecular recognition. Research has yielded many insights into antibody design principles and promising biotechnological and pharmaceutical applications. Still, our understanding of how CDRs encode specific loop conformations lags far behind our understanding of structure-function relationships in non-immunological scaffolds. Thus, design of antibodies from first principles has not been demonstrated. We propose a computational-experimental strategy to address this challenge. We will: (a) characterize the design principles and sequence elements that rigidify antibody CDRs. Natural antibody loops will be subjected to computational modeling, crystallography, and a combined in vitro evolution and deep-sequencing approach to isolate sequence features that rigidify loop backbones; (b) develop a novel computational-design strategy, which uses the >1000 solved antibody structures deposited in structure databases to realistically model CDRs and design them to recognize proteins that have not been co-crystallized with antibodies. For example, we will design novel antibodies targeting insulin, for which clinically useful diagnostics are needed. By accessing much larger sequence/structure spaces than are available to natural immune-system repertoires and experimental methods, computational antibody design could produce higher-specificity and higher-affinity binders, even to challenging targets; and (c) develop new strategies to program conformational change in CDRs, generating, e.g., the first allosteric antibodies.
These will allow targeting, in principle, of any molecule, potentially revolutionizing how antibodies are generated for research and medicine, and providing new insights into the design principles of protein functional sites.
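The design strategy above scores and searches sequence space against structure-based models. Purely as a toy illustration of that kind of stochastic sequence search (the scoring function here is an invented match-counting score, not the proposal's physics-based design energy):

```python
# Illustrative sketch only: real antibody design optimizes force-field
# energies over 3D structures. A toy Metropolis search over a short "CDR"
# sequence against a hypothetical match-counting score.
import math
import random

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def design(score, length, steps=2000, temp=0.5, seed=0):
    """Minimise score(seq) by random single-position mutations (Metropolis)."""
    rng = random.Random(seed)
    seq = [rng.choice(AMINO) for _ in range(length)]
    e = score(seq)
    best, best_e = list(seq), e
    for _ in range(steps):
        trial = list(seq)
        trial[rng.randrange(length)] = rng.choice(AMINO)
        e_trial = score(trial)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if e_trial <= e or rng.random() < math.exp(-(e_trial - e) / temp):
            seq, e = trial, e_trial
            if e < best_e:
                best, best_e = list(seq), e
    return "".join(best), best_e

# Hypothetical target motif; score = number of mismatched positions
target = "GYTFTSYW"
seq, energy = design(lambda s: sum(a != b for a, b in zip(s, target)), len(target))
print(seq, energy)
```

The real problem is vastly harder (the score must capture loop rigidity and binding, and the search couples sequence with backbone conformation), but the accept/reject loop is the common skeleton of such stochastic design methods.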


Max ERC Funding

1 499 930 €

Duration

Start date: 2013-09-01, End date: 2018-08-31

Project acronym: ACDC

Project: Algorithms and Complexity of Highly Decentralized Computations

Researcher (PI): Fabian Daniel Kuhn

Host Institution (HI): ALBERT-LUDWIGS-UNIVERSITAET FREIBURG

Call Details: Starting Grant (StG), PE6, ERC-2013-StG

Summary: "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as the Internet, the World Wide Web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible, and we can safely assume that this trend will continue. Modern systems are often envisioned to consist of a potentially large number of individual components organized in a completely decentralized way: there is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Moreover, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network can change dynamically, either by nodes joining or leaving the system or if the topology changes over time, e.g., because of the mobility of the devices in case of a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks."
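As a concrete instance of the information-dissemination tasks mentioned, here is a minimal simulation of synchronous flooding, the simplest decentralized primitive: each node communicates only with its direct neighbours, and a message spreads one hop per round (the graph and source are illustrative):

```python
# Illustrative sketch: synchronous flooding. Each node only knows its
# neighbours; the number of rounds until everyone is informed equals the
# source's eccentricity (its greatest distance to any other node).

def flood_rounds(adjacency, source):
    """Simulate round-based flooding; return rounds until all nodes informed."""
    informed = {source}
    rounds = 0
    while len(informed) < len(adjacency):
        # in one synchronous round, every informed node tells its neighbours
        informed |= {v for u in informed for v in adjacency[u]}
        rounds += 1
    return rounds

# A 6-node path graph: information must travel the full length of the path
path = {i: [j for j in (i - 1, i + 1) if 0 <= j < 6] for i in range(6)}
print(flood_rounds(path, 0))  # 5 rounds: one hop per round from the endpoint
```

Even this trivial task shows the locality constraint at the heart of the project: no node ever sees the global state, and the running time is governed by the network's distance structure, which is exactly what changes under dynamic topologies.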


Max ERC Funding

1 148 000 €

Duration

Start date: 2013-11-01, End date: 2018-10-31

Project acronym: AcetyLys

Project: Unravelling the role of lysine acetylation in the regulation of glycolysis in cancer cells through the development of synthetic biology-based tools

Researcher (PI): Eyal Arbely

Host Institution (HI): BEN-GURION UNIVERSITY OF THE NEGEV

Call Details: Starting Grant (StG), LS9, ERC-2015-STG

Summary: Synthetic biology is an emerging discipline that offers powerful tools to control and manipulate fundamental processes in living matter. We propose to develop and apply such tools to modify the genetic code of cultured mammalian cells and bacteria, with the aim of studying the role of lysine acetylation in the regulation of metabolism and in cancer development. Thousands of lysine acetylation sites were recently discovered on non-histone proteins, suggesting that acetylation is a widespread and evolutionarily conserved post-translational modification, similar in scope to phosphorylation and ubiquitination. Specifically, it has been found that most of the enzymes of metabolic processes—including glycolysis—are acetylated, implying that acetylation is a key regulator of cellular metabolism in general and of glycolysis in particular. The regulation of metabolic pathways is of particular importance to cancer research, as misregulation of metabolic pathways, especially upregulation of glycolysis, is common to most transformed cells and is now considered a new hallmark of cancer. These data raise an immediate question: what is the role of acetylation in the regulation of glycolysis and in the metabolic reprogramming of cancer cells? While current methods rely on mutational analyses, we will genetically encode the incorporation of acetylated lysine and directly measure the functional role of each acetylation site in cancerous and non-cancerous cell lines. Using this methodology, we will study the structural and functional implications of all the acetylation sites in glycolytic enzymes. We will also decipher the mechanism by which acetylation is regulated by deacetylases and answer a long-standing question: how do 18 deacetylases recognise their substrates among thousands of acetylated proteins? The developed methodologies can be applied to a wide range of protein families known to be acetylated, making this study relevant to diverse research fields.


Max ERC Funding

1 499 375 €

Duration

Start date: 2016-07-01, End date: 2021-06-30

Project acronym: ACO

Project: The Proceedings of the Ecumenical Councils from Oral Utterance to Manuscript Edition as Evidence for Late Antique Persuasion and Self-Representation Techniques

Researcher (PI): Peter Alfred Riedlberger

Host Institution (HI): OTTO-FRIEDRICH-UNIVERSITAET BAMBERG

Call Details: Starting Grant (StG), SH5, ERC-2015-STG

Summary: The Acts of the Ecumenical Councils of Late Antiquity include (purportedly) verbatim minutes of the proceedings, a formal framework, and copies of relevant documents which were either (allegedly) read out during the proceedings or later attached to the Acts proper. Despite this unusual wealth of documentary evidence, the daunting nature of the Acts, which demand multidisciplinary competency, their complex structure with a matryoshka-like nesting of proceedings from different dates, and the stereotype that their contents bear only on Christological niceties have deterred generations of historians from studying them. Only in recent years have their fortunes begun to improve, but this recent research has not always been based on sound principles: the recorded proceedings of the sessions are still often accepted as verbatim minutes. Yet even a superficial reading quickly reveals widespread editorial interference. We must accept that in many cases the Acts will teach us less about the actual debates than about the editors who shaped their presentation. This does not depreciate the Acts’ evidence: on the contrary, they are first-rate material for the rhetoric of persuasion and self-representation. It is possible, in fact, to take the investigation to a deeper level and examine in what manner the oral proceedings were put into writing: several passages in the Acts comment upon the process of note-taking and the work of the shorthand writers. Thus, the main objective of the proposed research project could be described as an attempt to trace the destinies of the Acts’ texts, from the oral utterance to the manuscript texts we have today. This will include the fullest study to date of ancient transcript techniques; a structural analysis of the Acts’ texts with the aim of highlighting edited passages; and a careful comparison of the various editions of the Acts, which survive in Greek, Latin, Syriac and Coptic, in order to detect traces of editorial interference.

Max ERC Funding

1 497 250 €

Duration

Start date: 2016-05-01, End date: 2021-04-30

Project acronym: ACTAR TPC

Project: Active Target and Time Projection Chamber

Researcher (PI): Geoffrey Fathom Grinyer

Host Institution (HI): GRAND ACCELERATEUR NATIONAL D'IONS LOURDS

Call Details: Starting Grant (StG), PE2, ERC-2013-StG

Summary

The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies of the structure and decays of the most exotic nuclei. The use of a gas volume that acts both as the sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications, but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pads of 2 mm are the smallest ever attempted in either discipline, but this fine granularity is a requirement for high-efficiency, high-resolution nuclear spectroscopy. The correspondingly large number of electronic channels (16000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot presently be contemplated will become feasible with ACTAR TPC.
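As a rough sanity check of the channel count quoted in the summary, a 25×25 cm pad plane segmented into 2 mm pads yields on the order of 16000 electronics channels. The sketch below (assuming one channel per square pad, a simplification of the real pad-plane layout) shows the arithmetic:

```python
# Approximate channel count for the ACTAR TPC pad plane described above:
# a 25 cm x 25 cm surface tiled with 2 mm x 2 mm detection pads,
# one electronics channel per pad (layout details are assumed here).
side_mm = 250      # pad-plane side length, millimetres
pad_mm = 2         # individual pad pitch, millimetres

pads_per_side = side_mm // pad_mm       # 125 pads along each edge
total_channels = pads_per_side ** 2     # one channel per pad

print(pads_per_side, total_channels)    # 125 15625, i.e. ~16000 channels
```

The exact figure of 16000 in the summary presumably includes the precise pad geometry and any auxiliary channels; the estimate above only illustrates why such fine segmentation drives the electronics density.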