Project Evolution of Alzheimer’s Disease: From dynamics of single synapses to memory loss

Researcher (PI) Inna Slutsky

Host Institution (HI) TEL AVIV UNIVERSITY

Call Details Starting Grant (StG), LS5, ERC-2011-StG_20101109

Summary A persistent challenge in unravelling the mechanisms that regulate memory function is how to bridge the gap between the inter-molecular dynamics of single proteins, the activity of individual synapses and the emergent properties of neuronal circuits. The prototypical condition of disintegrating neuronal circuits is Alzheimer’s Disease (AD). Since the days of Alois Alzheimer at the turn of the 20th century, scientists have been searching for a molecular entity at the root of the cognitive deficits. Although diverse lines of evidence suggest that the amyloid-beta peptide (Abeta) plays a central role in the synaptic dysfunctions of AD, several key questions remain unresolved. First, endogenous Abeta peptides are secreted by neurons throughout life, but their physiological functions are largely unknown. Second, the experience-dependent physiological mechanisms that initiate the changes in Abeta composition in sporadic AD, the most frequent form of the disease, are unidentified. And finally, the molecular mechanisms that trigger Abeta-induced synaptic failure and memory decline remain elusive.
To address these questions, I propose to develop an integrative approach to correlate structure and function at the level of single synapses in hippocampal circuits. State-of-the-art techniques will enable the simultaneous real-time visualization of inter-molecular dynamics within signalling complexes and functional synaptic modifications. Utilizing FRET spectroscopy, high-resolution optical imaging, electrophysiology, molecular biology and biochemistry, we will determine the causal relationships between ongoing neuronal activity, the spatio-temporal dynamics and molecular composition of Abeta, structural rearrangements within the Abeta signalling complexes, and the plasticity of single synapses and whole networks. The proposed research will elucidate fundamental principles of neuronal circuit function and identify the critical steps that initiate primary synaptic dysfunctions at the very early stages of sporadic AD.
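The single-synapse readout of inter-molecular dynamics rests on the steep distance dependence of FRET. As standard background (not a project-specific result), the transfer efficiency E between a donor and an acceptor fluorophore separated by a distance r is

```latex
E = \frac{1}{1 + (r / R_0)^6}
```

where R_0, the Förster radius at which E = 1/2, is typically a few nanometres; this makes E a sensitive molecular-scale ruler for conformational rearrangements within signalling complexes.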

Max ERC Funding

2 000 000 €

Duration

Start date: 2011-12-01, End date: 2017-09-30

Project acronym ALGILE

Project Foundations of Algebraic and Dynamic Data Management Systems

Researcher (PI) Christoph Koch

Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE

Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014

Summary "Contemporary database query languages are ultimately founded on logic and feature an additive operation – usually a form of (multi)set union or disjunction – that is asymmetric in that additions or updates do not always have an inverse. This asymmetry puts a large part of the machinery of abstract algebra for equation solving out of the reach of databases. However, such equation solving would be a key functionality to which problems such as query equivalence testing and data integration could be reduced: in the presence of an asymmetric additive operation, these problems are undecidable. Moreover, query languages with a symmetric additive operation (i.e., one that has an inverse and is thus grounded in ring theory) would open up databases to a large range of new scientific and mathematical applications.
The goal of the proposed project is to reinvent database management systems with a foundation in abstract algebra, and specifically in ring theory. The presence of an additive inverse makes it possible to cleanly define differences between queries. This gives rise to a database analog of differential calculus that leads to radically new incremental and adaptive query evaluation algorithms which substantially outperform state-of-the-art techniques. These algorithms enable a new class of systems which I call Dynamic Data Management Systems. Such systems can maintain continuously fresh query views at extremely high update rates and have important applications in interactive large-scale data analysis. There is a natural connection between differences and updates, motivating the group-theoretic study of updates, which will lead to better ways of creating out-of-core data processing algorithms for new storage devices. Basing queries on ring theory leads to a new class of systems, Algebraic Data Management Systems, which herald a convergence of database systems and computer algebra systems."
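As a minimal sketch of the delta idea in Python (illustrative names only, not the project’s actual system): if relations are represented as multisets with integer multiplicities, every insertion has an additive inverse, so a view such as a SUM aggregate can be maintained from each update alone, with no recomputation.

```python
# Delta-based incremental view maintenance over Z-relations: multisets
# whose tuples carry integer multiplicities, so deletions are simply
# negative deltas. All names are illustrative, not from a real system.
from collections import defaultdict

class ZRelation:
    """A multiset in which each tuple carries an integer multiplicity."""
    def __init__(self):
        self.mult = defaultdict(int)

    def apply_delta(self, tup, k):
        """Insert (k > 0) or delete (k < 0) k copies of tup."""
        self.mult[tup] += k
        if self.mult[tup] == 0:
            del self.mult[tup]

def delta_sum(view_total, tup, k, value_index=0):
    """Maintain SUM(value) under a delta of k copies of tup: the change
    to the view is k * value, computed without rescanning the relation."""
    return view_total + k * tup[value_index]

# Maintain SUM(price) over an orders relation under two inserts and a delete.
orders, total = ZRelation(), 0
for tup, k in [((10,), +1), ((25,), +1), ((10,), -1)]:
    orders.apply_delta(tup, k)
    total = delta_sum(total, tup, k)
print(total)  # 25 -- the deletion is handled by the additive inverse
```

The same delta pattern composes through joins, which is what a database analogue of differential calculus builds on.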

Summary Computer animation has traditionally been associated with applications in virtual-reality-based training, video games or feature films. However, interactive animation is gaining relevance in a much broader scope, as a tool for early-stage analysis, design and planning in many applications in science and engineering. The user gets quick visual feedback on the results and can then proceed by refining the experiments or designs. Potential applications include nanodesign, e-commerce or tactile telecommunication, but they extend as far as, e.g., the analysis of ecological, climate, biological or physiological processes.
The application of computer animation is extremely limited in comparison to its potential reach, owing to a trade-off between accuracy and computational efficiency. This trade-off is induced by inherent sources of complexity such as nonlinear or anisotropic behaviors, heterogeneous properties, and high dynamic ranges of effects.
The Animetrics project proposes a modeling and animation methodology that consists of a multi-scale decomposition of complex processes, the description of the process at each scale through combinations of simple local models, and the fitting of those local models’ parameters to large amounts of data from example effects. The methodology will be explored on specific problems arising in complex mechanical phenomena, including the viscoelasticity of solids and thin shells, multi-body contact, granular and liquid flow, and fracture of solids.
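To make the parameter-fitting step concrete, here is a minimal, purely illustrative sketch (not the project’s actual models or data): estimating the stiffness k of a linear spring law f = -k*x by closed-form least squares from sampled force/displacement pairs, the kind of simple local model the methodology would calibrate against example effects.

```python
# Fit the stiffness k of a local linear spring model f = -k * x by
# least squares from example force/displacement data. The data values
# below are made up for illustration.
xs = [0.00, 0.01, 0.02, 0.03, 0.04]      # displacements (m)
fs = [0.0, -0.52, -0.98, -1.51, -2.02]   # measured forces (N), noisy

# Closed-form least squares for f = -k*x:  k = -sum(x*f) / sum(x*x)
k = -sum(x * f for x, f in zip(xs, fs)) / sum(x * x for x in xs)
print(round(k, 1))  # ~50.3 N/m; the calibrated local model then feeds
                    # into the multi-scale decomposition
```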

Summary "During the past twenty years, we have witnessed profound technological changes, summarised under terms such as the digital revolution or the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are essential to ensuring the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular number theory and arithmetic geometry. In fact, the so-called security “proofs” are all conditional on the algorithmic intractability of certain number-theoretic problems, such as the factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists, who use a black-box security reduction to a supposedly hard problem in algorithmic number theory, and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
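For a flavour of the number-theoretic problems at stake, here is a toy baby-step giant-step discrete-logarithm solver (a standard textbook algorithm sketched in Python, unrelated to the project’s code); it takes roughly sqrt(p) steps, which is feasible below only because the modulus is tiny.

```python
# Baby-step giant-step: solve g**x = h (mod p) in O(sqrt(p)) time and
# space. The hardness of this problem for large p underpins widely
# deployed public-key cryptosystems.
from math import isqrt

def bsgs(g, h, p):
    """Return x with pow(g, x, p) == h, or None if no solution exists."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}  # baby steps: g^j mod p
    factor = pow(g, -m, p)                      # g^(-m) mod p (Python 3.8+)
    gamma = h
    for i in range(m):                          # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * factor) % p
    return None

print(bsgs(5, 3, 23))  # 16, since pow(5, 16, 23) == 3
```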

Max ERC Funding

1 453 507 €

Duration

Start date: 2012-01-01, End date: 2016-12-31

Project acronym ASAP

Project Adaptive Security and Privacy

Researcher (PI) Bashar Nuseibeh

Host Institution (HI) THE OPEN UNIVERSITY

Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209

Summary With the prevalence of mobile computing devices and the increasing availability of pervasive services, ubiquitous computing (Ubicomp) is a reality for many people. This reality is generating opportunities for people to interact socially in new and richer ways, and to work more effectively in a variety of new environments. More generally, Ubicomp infrastructures – controlled by software – will determine users’ access to critical services.
With these opportunities come higher risks of misuse by malicious agents. Therefore, the role and design of software for managing use and protecting against misuse is critical, and the engineering of software that is functionally effective while safeguarding user assets from harm is a key challenge. Indeed, the very nature of Ubicomp means that software must adapt to the changing needs of users and their environment, and, more critically, to the different threats to users’ security and privacy.
ASAP proposes to radically re-conceptualise software engineering for Ubicomp in ways that are cognisant of the changing functional needs of users, of the changing threats to user assets, and of the changing relationships between them. We propose to deliver adaptive software capabilities for supporting users in managing their privacy requirements, and adaptive software capabilities to deliver secure software that underpins those requirements. A key novelty of our approach is its holistic treatment of security and human behaviour. To achieve this, it draws upon contributions from requirements engineering, security & privacy engineering, and human-computer interaction. Our aim is to contribute to software engineering that empowers and protects Ubicomp users. Underpinning our approach will be the development of representations of security and privacy problem structures that capture user requirements, the context in which those requirements arise, and the adaptive software that aims to meet those requirements.
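As a minimal sketch of what “adaptive” could look like at the code level (hypothetical names, assets and rules, purely illustrative of the idea rather than the project’s design): a disclosure policy whose decision depends on the user’s runtime context and the observed threat level, and which is simply re-evaluated whenever either changes.

```python
# A toy adaptive privacy policy: the same disclosure request yields
# different decisions as context and threat level change at runtime.
# All names, assets and rules here are hypothetical.
from dataclasses import dataclass

@dataclass
class Context:
    location: str       # e.g. "home" or "public"
    threat_level: int   # 0 (benign) .. 2 (hostile), from runtime monitoring

def may_disclose(asset: str, ctx: Context) -> bool:
    """Withhold sensitive assets whenever the context is risky."""
    sensitive = {"health_record", "precise_location"}
    if asset in sensitive:
        return ctx.location == "home" and ctx.threat_level == 0
    return ctx.threat_level < 2

ctx = Context(location="public", threat_level=0)
print(may_disclose("precise_location", ctx))  # False: risky location
ctx.threat_level = 2
print(may_disclose("calendar", ctx))          # False: policy adapted to the new threat
```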

Max ERC Funding

2 499 041 €

Duration

Start date: 2012-10-01, End date: 2018-09-30

Project acronym AstroFunc

Project Molecular Studies of Astrocyte Function in Health and Disease

Researcher (PI) Matthew Guy Holt

Host Institution (HI) VIB

Call Details Starting Grant (StG), LS5, ERC-2011-StG_20101109

Summary The brain consists of two basic cell types – neurons and glia. However, the study of glia in brain function has traditionally been neglected in favor of their more “illustrious” counterparts – neurons, which are classed as the computational units of the brain. Glia have usually been classed as “brain glue” – a supportive matrix on which neurons grow and function. However, recent evidence suggests that glia are more than passive “glue” and actually modulate neuronal function. This has led to the proposal of a “tripartite synapse”, which recognizes pre- and postsynaptic neuronal elements and glia as a unit.
However, what is still lacking is rudimentary information on how these cells actually function in situ. Here we propose taking a “bottom-up” approach, identifying the molecules (and interactions) that control glial function in situ. This is complicated by the fact that glia show profound changes when placed into culture. To circumvent this, we will use recently developed cell sorting techniques to rapidly isolate genetically marked glial cells from the brain, which can then be analyzed using advanced biochemical and physiological techniques. The long-term aim is to identify proteins that can be “tagged” using transgenic technologies, allowing protein function to be studied in real time in vivo using sophisticated imaging techniques. Given the number of proteins that may be identified, we envisage developing new methods of generating transgenic animals that provide an attractive alternative to current “state-of-the-art” technology.
The importance of studying glial function is underscored by the fact that every major brain pathology shows reactive gliosis. In the time it takes to read this abstract, 5 people in the EU will have suffered a stroke – not to mention those who suffer other forms of neurotrauma. Thus, understanding glial function is critical not only to understanding normal brain function, but also to relieving the burden of severe neurological injury and disease.

Max ERC Funding

1 490 168 €

Duration

Start date: 2012-01-01, End date: 2016-12-31

Project acronym BIONET

Project Network Topology Complements Genome as a Source of Biological Information

Researcher (PI) Natasa Przulj

Host Institution (HI) UNIVERSITY COLLEGE LONDON

Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014

Summary Genetic sequences have had an enormous impact on our understanding of biology. The expectation is that biological network data will have a similar impact. However, progress is hindered by a lack of sophisticated graph-theoretic tools for mining these large networked datasets.
In recent breakthrough work at the boundary of computer science and biology, supported by my USA NSF CAREER award, I developed sensitive network analysis, comparison and embedding tools which demonstrated that protein-protein interaction networks of eukaryotes are best modeled by geometric graphs. These tools also established an unprecedented, phenotypically validated link between network topology and biological function and disease. Now I propose to substantially extend these preliminary results and design sensitive and robust network alignment methods that will lead to uncovering unknown biology and evolutionary relationships. The potential ground-breaking impact of such network alignment tools could parallel that of the BLAST family of sequence alignment tools, which has revolutionized our understanding of biological systems and therapeutics. Furthermore, I propose to develop additional sophisticated graph-theoretic techniques to mine network data and hence complement the biological information that can be extracted from sequence. I propose to exploit these new techniques for biological applications in collaboration with experimentalists at Imperial College London: 1. aligning biological networks of species whose genomes are closely related but that have very different phenotypes, in order to uncover systems-level factors that contribute to the pronounced differences; 2. comparing and contrasting stress response pathways and metabolic pathways in bacteria in a unified systems-level framework and exploiting the findings for (a) bioengineering of micro-organisms for industrial applications (production of bio-fuels, bioremediation, production of biopolymers) and (b) biomedical applications.
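As a hedged, highly simplified sketch of topology-based network comparison (a crude stand-in for the graphlet-based signatures of the PI’s prior work; all names and the similarity score are illustrative): describe each node by a small topological signature and pair nodes across two networks by signature similarity.

```python
# Toy cross-network node matching by topological signature. The
# signature here is just (degree, #triangles); real graphlet-based
# signatures are far richer.
from itertools import combinations

def signatures(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    sig = {}
    for u, nbrs in adj.items():
        tri = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        sig[u] = (len(nbrs), tri)  # (degree, triangles through u)
    return sig

def match(sig_a, sig_b):
    """Pair each node of network A with the topologically closest node of B."""
    dist = lambda s, t: abs(s[0] - t[0]) + abs(s[1] - t[1])
    return {u: min(sig_b, key=lambda v: dist(su, sig_b[v]))
            for u, su in sig_a.items()}

net_a = [("p1", "p2"), ("p2", "p3"), ("p1", "p3"), ("p3", "p4")]
net_b = [("q1", "q2"), ("q2", "q3"), ("q1", "q3"), ("q3", "q4")]
print(match(signatures(net_a), signatures(net_b)))  # e.g. p4 pairs with q4
```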

Summary Our ability to think, to memorize and to focus our thoughts depends on acetylcholine signalling in the brain. The loss of cholinergic signalling in, for instance, Alzheimer’s disease strongly compromises these cognitive abilities. The traditional view of the role of cholinergic input to the neocortex is that slowly changing levels of extracellular acetylcholine (ACh) mediate different arousal states. This view has been challenged by recent studies demonstrating that rapid phasic changes in ACh levels on the scale of seconds are correlated with the focus of attention, suggesting that these signals may mediate defined cognitive operations. Despite a wealth of anatomical data on the organization of the cholinergic system, very little understanding exists of its functional organization. How the relatively sparse input of cholinergic transmission to the prefrontal cortex elicits such profound and specific control over attention is unknown. The main objective of this proposal is to develop a causal understanding of how cellular mechanisms of fast acetylcholine signalling are orchestrated during cognitive behaviour.
In a series of studies, I have identified several synaptic and cellular mechanisms by which the cholinergic system can alter neuronal circuit function, in both cortical and subcortical areas. Using a combination of behavioral, physiological and genetic methods, I manipulated cholinergic receptor functionality in the prefrontal cortex in a subunit-specific manner and found that ACh receptors in the prefrontal cortex control attention performance. Recent advances in optogenetic and electrochemical methods now make it possible to rapidly manipulate and measure acetylcholine levels in freely moving, behaving animals. Using these techniques, I aim to identify which cholinergic neurons are involved in fast cholinergic signalling during cognition and to uncover the underlying neuronal mechanisms that alter prefrontal cortical network function.

Max ERC Funding

1 499 242 €

Duration

Start date: 2011-11-01, End date: 2016-10-31

Project acronym BRiCPT

Project Basic Research in Cryptographic Protocol Theory

Researcher (PI) Jesper Buus Nielsen

Host Institution (HI) AARHUS UNIVERSITET

Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014

Summary In cryptographic protocol theory, we consider a situation where a number of entities want to solve some problem over a computer network. Each entity has some secret data it does not want the other entities to learn, yet they all want to learn something about the common set of data. For instance, in an electronic election, they want to know the number of yes-votes without revealing who voted what; in an electronic auction, they want to find the winner without leaking the bids of the losers.
A main focus of the project is to develop new techniques for solving such protocol problems. We are particularly interested in techniques which can automatically construct a protocol solving a problem, given only a description of the problem. My focus will be theoretical basic research, but I believe that advancing the theory of secure protocol compilers will have an immense impact on how secure protocols are developed in practice.
When one develops complex protocols, it is important to be able to verify their correctness before they are deployed, particularly so when the purpose of the protocols is to protect information: if an error is found and corrected only after deployment, the sensitive data may already have been compromised. Therefore, cryptographic protocol theory develops models of what it means for a protocol to be secure, and techniques for analyzing whether a given protocol is secure or not.
Another main focus of the project is to develop better security models, as existing security models suffer either from the problem that some protocols can be proven secure which are not secure in practice, or from the problem that security cannot be proven for some protocols which are believed to be secure in practice. My focus will again be on theoretical basic research, but I believe that better security models are important for advancing a practice where protocols are verified as secure before being deployed.
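As a hedged sketch of the electronic-election example above (a minimal additive secret-sharing toy in Python, not one of the project’s protocols): each voter splits a 0/1 vote into random shares that sum to the vote, so no single tally server learns anything about an individual vote, yet the servers’ combined totals reveal the number of yes-votes.

```python
# Additive secret sharing of votes among three tally servers. A real
# protocol would also need authentication and proofs that each vote is
# 0 or 1; this only illustrates the core hiding-while-summing idea.
import secrets

P = 2**61 - 1  # public prime modulus; shares live in Z_P

def share(vote, n=3):
    """Split vote into n uniformly random shares summing to vote mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((vote - sum(shares)) % P)
    return shares

votes = [1, 0, 1, 1, 0]
server_totals = [0, 0, 0]
for v in votes:
    for i, s in enumerate(share(v)):
        server_totals[i] = (server_totals[i] + s) % P  # server i sees only its share

print(sum(server_totals) % P)  # 3 -- the tally, with no individual vote revealed
```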

Summary We aim to develop and apply a suite of new technologies in a novel cancer discovery platform that will link high-definition cancer biology, via state-of-the-art disease imaging and pathway modelling, with the development of novel interrogative and therapeutic interventions to test in models of cancer that closely resemble human disease. The work will lead to a new understanding of cancer invasion, how to treat advanced disease in the metastatic niche, how to monitor therapeutic responses, and the compensatory mechanisms that cause acquired resistance. Platform development will be based on combined, cross-informing technologies that will enable us to predict optimal ‘maintenance therapies’ for metastatic disease by targeting cancer evolution and spread through combination therapy. A key strand of the platform is the development of quantitative multi-modal imaging in vivo, using optical window technology, to inform a detailed understanding of disease and drug mechanisms and the predictive capability of pathway biomarkers. Innovative methodologies are urgently needed to address the declining approval rates of novel medicines and the unmet clinical needs of cancer patients in the advanced disease setting, where tumour spread and survival generally continue unchecked by current therapies. This work will be largely pre-clinical, but will always be mindful of the clinical problem of managing late-stage human disease through the rational design of combination therapies with companion diagnostic tests. The cancer survival statistics will change if we can curb the continuing spread of aggressive, metastatic disease and resistance to therapy by taking smarter combined approaches that make the best use of emerging technologies, particularly where they are more predictive of clinical efficacy.
