ERC FUNDED PROJECTS

Project acronym: AGNOSTIC

Project: Actively Enhanced Cognition based Framework for Design of Complex Systems

Researcher (PI): Björn Ottersten

Host Institution (HI): UNIVERSITE DU LUXEMBOURG

Call Details: Advanced Grant (AdG), PE7, ERC-2016-ADG

Summary: Parameterized mathematical models have been central to the understanding and design of communication, networking, and radar systems. However, they often lack the ability to model intricate interactions innate in complex systems. On the other hand, data-driven approaches do not need explicit mathematical models for data generation and have a wider applicability at the cost of flexibility. These approaches need labelled data representing all the facets of the system's interaction with the environment. With the aforementioned systems becoming increasingly complex, with intricate interactions, and operating in dynamic environments, the number of system configurations can be rather large, leading to a paucity of labelled data. Thus, there are emerging networks of systems of critical importance whose cognition is not effectively covered by traditional approaches. AGNOSTIC uses the process of exploration through system probing and exploitation of observed data in an iterative manner, drawing upon traditional model-based approaches and data-driven discriminative learning to enhance functionality, performance, and robustness through the notion of active cognition. AGNOSTIC clearly departs from a passive assimilation of data and aims to formalize the exploration/exploitation framework in dynamic environments. The development of this framework in three application areas is central to AGNOSTIC. The project aims to provide active cognition in radar to learn the environment and other active systems to ensure situational awareness and coexistence; to apply active probing in radio access networks to infer network behaviour towards spectrum sharing and self-configuration; and to learn and adapt to user demand for content distribution in caching networks, drastically improving network efficiency.
Although these cognitive systems interact with the environment in very different ways, sufficient abstraction allows cross-fertilization of insights and approaches motivating their joint treatment.
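The iterative loop of probing (exploration) and using observed data (exploitation) described above is the classic multi-armed-bandit setting. As a purely illustrative sketch, here is a textbook epsilon-greedy strategy, not AGNOSTIC's actual algorithm; the "system configurations" and their noisy reward model are hypothetical:

```python
import random

def epsilon_greedy(reward_fns, steps=10_000, epsilon=0.1, seed=0):
    """Balance probing (exploration) against reusing the best-known
    configuration (exploitation) over a set of system configurations."""
    rng = random.Random(seed)
    n = len(reward_fns)
    counts = [0] * n      # how often each configuration was probed
    means = [0.0] * n     # running mean of observed reward per configuration
    for _ in range(steps):
        if rng.random() < epsilon:      # explore: probe a random configuration
            a = rng.randrange(n)
        else:                           # exploit: pick the best estimate so far
            a = max(range(n), key=lambda i: means[i])
        r = reward_fns[a](rng)
        counts[a] += 1
        means[a] += (r - means[a]) / counts[a]   # incremental mean update
    return means, counts

# Hypothetical configurations with noisy rewards (e.g. observed throughput).
arms = [lambda rng, m=m: m + rng.gauss(0, 0.5) for m in (0.2, 0.5, 0.9)]
means, counts = epsilon_greedy(arms)
assert counts[2] == max(counts)  # the best configuration ends up probed most
```

The point of the sketch is the structure of the loop: a small, fixed fraction of probes is spent on exploration so the estimates never go stale in a changing environment.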


Summary: We propose focused theory developments and applications, which aim to substantially advance our ability to model and understand the behavior of molecules in complex environments. From a large repertoire of possible environments, we have chosen to concentrate on experimentally relevant situations, including molecular fluctuations in electric and optical fields, disordered molecular crystals, solvated (bio)molecules, and molecular interactions at/through low-dimensional nanostructures. A challenging aspect of modeling such realistic environments is that both molecular electronic and nuclear fluctuations have to be treated efficiently at a robust quantum-mechanical level of theory for systems with 1000s of atoms. In contrast, the current state of the art in the modeling of complex molecular systems typically consists of Newtonian molecular dynamics employing classical force fields. We will develop radically new approaches for electronic and nuclear fluctuations that unify concepts and merge techniques from quantum-mechanical many-body Hamiltonians, statistical mechanics, density-functional theory, and machine learning. Our developments will be benchmarked using experimental measurements with terahertz (THz) spectroscopy, atomic-force and scanning tunneling microscopy (AFM/STM), time-of-flight (TOF) measurements, and molecular interferometry.
Our final goal is to bridge the accuracy of quantum mechanics with the efficiency of force fields, enabling large-scale predictive quantum molecular dynamics simulations for complex systems containing 1000s of atoms, and leading to novel conceptual insights into quantum-mechanical fluctuations in large molecular systems. The project goes well beyond the presently possible applications and once successful will pave the road towards having a suite of first-principles-based modeling tools for a wide range of realistic materials, such as biomolecules, nanostructures, disordered solids, and organic/inorganic interfaces.
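One of the building blocks named above, machine learning, is commonly used to build cheap surrogates for expensive quantum-mechanical energy evaluations, which is one way to bridge quantum accuracy with force-field efficiency. A minimal sketch using kernel ridge regression on a toy one-dimensional Morse potential; the potential and all parameters are hypothetical stand-ins for reference quantum data, not the project's methods:

```python
import numpy as np

def krr_fit(X, y, gamma=10.0, lam=1e-6):
    """Kernel ridge regression with a Gaussian kernel: a standard ML
    surrogate for expensive reference energy calculations."""
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, x, gamma=10.0):
    """Evaluate the learned surrogate at a new geometry x."""
    k = np.exp(-gamma * (X_train - x) ** 2)
    return k @ alpha

# Hypothetical training data: bond length -> energy from a Morse potential,
# standing in for reference quantum-mechanical calculations.
def morse(r, De=1.0, a=1.5, re=1.0):
    return De * (1 - np.exp(-a * (r - re))) ** 2

rng = np.random.default_rng(0)
R = rng.uniform(0.6, 2.0, 40)   # 40 sampled geometries
alpha = krr_fit(R, morse(R))
# The surrogate reproduces the reference energy at an unseen geometry.
assert abs(krr_predict(R, alpha, 1.3) - morse(1.3)) < 0.05
```

Once fitted, each surrogate evaluation is a cheap kernel sum rather than a full electronic-structure calculation, which is what makes large-scale dynamics feasible.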


Max ERC Funding

1 811 650 €

Duration

Start date: 2017-03-01, End date: 2022-02-28

Project acronym: BugTheDrug

Project: Predicting the effects of gut microbiota and diet on an individual’s drug response and safety

Researcher (PI): Ines THIELE

Host Institution (HI): UNIVERSITE DU LUXEMBOURG

Call Details: Starting Grant (StG), LS7, ERC-2017-STG

Summary: Precision medicine is an emerging paradigm that aims at maximizing the benefits and minimizing the harm of drugs. Realistic mechanistic models are needed to understand and limit heterogeneity in drug responses. Consequently, novel approaches are required that explicitly account for individual variations in response to environmental influences, in addition to genetic variation. The human gut microbiota metabolizes drugs and is modulated by diet, and it exhibits significant variation among individuals. However, the influence of the gut microbiota on drug failure or drug side effects is under-researched. In this study, I will combine whole-body, genome-scale, molecular-resolution modeling of human metabolism and human gut microbial metabolism (each representing a network of genes, proteins, and biochemical reactions) with physiological, clinically relevant modeling of drug responses. I will perform two pilot studies on human subjects to illustrate that this innovative, versatile computational modeling framework can be used to stratify patients prior to drug prescription and to optimize drug bioavailability through personalized dietary intervention. With these studies, BugTheDrug will advance mechanistic understanding of drug-microbiota-diet interactions and their contribution to individual drug responses. I will perform the first integration of cutting-edge approaches and novel insights from four distinct research areas: systems biology, quantitative systems pharmacology, microbiology, and nutrition. BugTheDrug conceptually and technologically addresses the demand for novel approaches to the study of individual variability, thereby providing breakthrough support for progress in precision medicine.
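Genome-scale metabolic models of the kind mentioned above are commonly analysed with flux balance analysis (FBA): find reaction fluxes that satisfy steady-state stoichiometry and bounds while maximizing an objective such as biomass. A toy sketch on a hypothetical three-reaction network; this is standard constraint-based modeling, not BugTheDrug's actual whole-body model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# uptake, conversion, biomass). Steady state requires S @ v == 0.
S = np.array([
    [1.0, -1.0,  0.0],   # A: produced by uptake, consumed by conversion
    [0.0,  1.0, -1.0],   # B: produced by conversion, consumed by biomass
])
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 flux units
c = [0, 0, -1.0]   # linprog minimizes, so negate to maximize biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
assert res.success
assert abs(res.x[2] - 10.0) < 1e-6  # biomass flux is limited by uptake
```

In a real application the matrix S has thousands of reactions and the bounds encode diet, drug, and microbial constraints; the optimization structure stays the same.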


Max ERC Funding

1 687 458 €

Duration

Start date: 2018-04-01, End date: 2023-03-31

Project acronym: ELWar

Project: Electoral Legacies of War: Political Competition in Postwar Southeast Europe

Researcher (PI): Josip GLAURDIC

Host Institution (HI): UNIVERSITE DU LUXEMBOURG

Call Details: Starting Grant (StG), SH2, ERC-2016-STG

Summary: We know remarkably little about the impact of war on political competition in postwar societies, in spite of the fact that postwar elections have garnered tremendous interest from researchers in a variety of fields. That interest, however, has been limited to establishing the relationship between electoral democratization and the incidence of conflict. Voters’ and parties’ electoral behaviour after the immediate post-conflict period has remained largely neglected by researchers. The proposed project will fill this gap in our understanding of the electoral legacies of war by analysing the evolution of political competition over the course of more than two decades in the six postwar states of Southeast Europe: Bosnia-Herzegovina, Croatia, Kosovo, Macedonia, Montenegro, and Serbia. Organised around three thematic areas/levels of analysis (voters, parties, communities), the project will lead to a series of important contributions. Through a combination of public opinion research, oral histories, and the innovative method of matching individual census entries, the project will answer to what extent postwar elections are decided by voters’ experiences and perceptions of the ended conflict, as opposed to their considerations of the parties’ peacetime economic platforms and performance in office. In-depth study of party documents and platforms, party relations with the organisations of the postwar civil sector, as well as interviews with party officials and activists, will shed light on the influence of war on the electoral strategies, policy preferences, and recruitment methods of postwar political parties.
And a combination of large-N research on the level of the region’s municipalities and a set of paired comparisons of several communities in the different postwar states of the region will help expose the mechanisms through which war becomes embedded in postwar political competition and thus continues to exert its influence even decades after the violence has ended.
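The census-entry matching mentioned above is an instance of record linkage: linking the same individual across data waves. A minimal sketch combining blocking on birth year with fuzzy name similarity; the names, threshold, and scoring are hypothetical illustrations, not the project's actual method:

```python
from difflib import SequenceMatcher

def match_census_entries(entries_a, entries_b, threshold=0.85):
    """Link individuals across two census waves: block on exact birth year,
    then keep the most similar name above a similarity threshold."""
    links = []
    for ia, (name_a, year_a) in enumerate(entries_a):
        best, best_score = None, threshold
        for ib, (name_b, year_b) in enumerate(entries_b):
            if year_a != year_b:
                continue   # blocking: only compare within the same birth year
            score = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
            if score >= best_score:
                best, best_score = ib, score
        if best is not None:
            links.append((ia, best))
    return links

# Hypothetical entries: spelling variants survive the fuzzy comparison.
wave1 = [("Ana Kovac", 1960), ("Ivan Horvat", 1955)]
wave2 = [("Ivan Horvat", 1955), ("Ana Kovač", 1960)]
assert match_census_entries(wave1, wave2) == [(0, 1), (1, 0)]
```

Blocking keeps the quadratic comparison cost manageable on census-scale data, while the fuzzy score tolerates transliteration and spelling differences.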


Summary: A grand challenge in today’s materials research is the realization of flexible materials that are also intelligent and functional. They will be the enablers of true breakthroughs in the hot trends of soft robotics and wearable technology. The standard approach to the latter is to decorate rubber sheets with electronic components, which has two serious flaws: rubber is uncomfortable as it does not breathe, and solid-state electronics will eventually fail as a garment is flexed and stretched when worn. While the softness of rubber is ideal, it must be used in the form of textile fibers to provide breathability, and for long-term failure resistance we need intelligent components that are soft. A solution to this conundrum was recently presented by the PI with the concept of liquid crystal (LC) electrospinning. The extreme responsiveness of LCs is transferred to a non-woven textile by incorporating the LC in the fiber core, yielding a smart flexible mat with sensory function. Moreover, it consumes no power, providing a further advantage over electronics-based approaches. In a second research line he uses microfluidics to make LC rubber microshells, functioning as autonomous actuators which may serve as innovative components for soft robotics, and photonic crystal shells. This interdisciplinary project presents an ambitious agenda to advance these new concepts towards the realization of soft, stretchable intelligent materials of revolutionary character.
Five specific objectives are in focus: 1) develop understanding of the dynamic response of LCs in these unconventional configurations; 2) establish interaction dynamics during polymerisation of an LC precursor; 3) elucidate LC response to gas exposure; 4) establish correlation between actuation response and internal order of curved LCE rubbers; and 5) assess usefulness of LC-functionalized fibers and polymerized LC shells, tubes and Janus particles in wearable sensors, soft robotic actuators and high-security identification tags.


Max ERC Funding

1 929 976 €

Duration

Start date: 2015-04-01, End date: 2020-03-31

Project acronym: NanoThermo

Project: Energy Conversion and Information Processing at Small Scales

Researcher (PI): Massimiliano Gennaro Esposito

Host Institution (HI): UNIVERSITE DU LUXEMBOURG

Call Details: Consolidator Grant (CoG), PE3, ERC-2015-CoG

Summary: Thermodynamics provided mankind with the intellectual tools to master energy transfers and energy conversion in macroscopic systems operating close to equilibrium. It is now one of the most fundamental theories in physics. My goal is to establish a thermodynamic theory describing energy conversion and information processing in small synthetic or biological systems operating far from equilibrium. Significant progress has been achieved in this direction over the last decade. The new theory is called stochastic thermodynamics (ST). It allows us to describe and understand energy conversion in systems as diverse as quantum junctions and molecular motors, and also to predict the energetic cost of information processing operations such as erasing bits of information or feedback controlling a small device. It was validated in single-molecule pulling experiments, electronic circuits, NMR, and colloidal particles in optical tweezers. Nevertheless, ST still suffers from serious limitations which prevent its application in more complex systems. Therefore, I propose to expand the theoretical foundations of ST far beyond its current realm of validity and to broaden the scope of its applications in various new directions. I want to answer questions such as: Can one design devices made of many small energy converters (e.g. thermoelectric junctions) arranged in such a way as to generate collective behaviors (e.g. synchronization) prompting higher powers and efficiencies? Can one do the same by engineering quantum effects? How can one reduce the dissipation occurring when computing very quickly with small devices? Why are metabolic networks so efficient in converting energy, transmitting information, and preventing errors (e.g. toxic byproducts)? I will do so in close contact with leading experimental groups in the field. My conviction is that ST will become as important for nanotechnologies and molecular biology as thermodynamics has been for the industrial revolution.
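A central quantity in stochastic thermodynamics is the steady-state entropy production rate of a Markov jump process: it vanishes under detailed balance and is strictly positive for driven systems. As a textbook illustration (not a result of the project), a driven three-state cycle with forward rate kf and backward rate kb has the analytic entropy production rate (kf - kb) * ln(kf / kb) in units of k_B:

```python
import numpy as np

def steady_state(W):
    """Stationary distribution of a Markov jump process with rate matrix W
    (W[i, j] = rate of jump j -> i for i != j; columns sum to zero)."""
    vals, vecs = np.linalg.eig(W)
    p = np.real(vecs[:, np.argmin(np.abs(vals))])   # null vector of W
    return p / p.sum()

def entropy_production_rate(W, p):
    """sigma = sum_{i<j} (W_ij p_j - W_ji p_i) * ln(W_ij p_j / (W_ji p_i)),
    the steady-state entropy production rate in units of k_B."""
    sigma = 0.0
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            jf, jb = W[i, j] * p[j], W[j, i] * p[i]
            sigma += (jf - jb) * np.log(jf / jb)
    return sigma

kf, kb = 2.0, 1.0   # biased cycle 0 -> 1 -> 2 -> 0
W = np.array([[-(kf + kb), kb, kf],
              [kf, -(kf + kb), kb],
              [kb, kf, -(kf + kb)]])
p = steady_state(W)
sigma = entropy_production_rate(W, p)
assert abs(sigma - (kf - kb) * np.log(kf / kb)) < 1e-10
```

With kf = kb the cycle satisfies detailed balance and the same code returns zero, which is the equilibrium limit the abstract departs from.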


Max ERC Funding

1 669 029 €

Duration

Start date: 2016-07-01, End date: 2021-06-30

Project acronym: RealTCut

Project: Towards real time multiscale simulation of cutting in non-linear materials with applications to surgical simulation and computer guided surgery

Researcher (PI): Stéphane Pierre Alain Bordas

Host Institution (HI): UNIVERSITE DU LUXEMBOURG

Call Details: Starting Grant (StG), PE8, ERC-2011-StG_20101014

Summary: Surgeons are trained as apprentices. Some conditions are rarely encountered, and surgeons will only be trained in the specific skills associated with a given situation if they come across it. At the end of their residency, it is hoped that they will have faced sufficiently many cases to be competent. This can be dangerous to patients.
If we were able to reproduce faithfully, in a virtual environment, the audio, visual and haptic experience of a surgeon as they prod, pull and incise tissue, then, surgeons would not have to train on cadavers, phantoms, or on the patients themselves.
Only a few researchers in the Computational Mechanics community have attacked the mechanical problems related to surgical simulation, so that mechanical faithfulness is not on par with the audiovisual experience. This lack of fidelity in the reproduction of surgical acts such as cutting may explain why most surgeons who tested existing simulators report that the "sensation" fed back to them remains unrealistic. To date, the proposers are not aware of Computational Mechanics solutions addressing, at the same time, geometrical faithfulness, material realism, evolving cuts, and quality control of the solution.
The measurable objectives for this research are as follows:
O1: Significantly alleviate the mesh generation and regeneration burden to represent organs’ geometries, underlying tissue microstructure and cuts with sufficient accuracy but minimal user intervention
O2: Move away from simplistic coarse-scale material models by deducing tissue rupture at the organ level from constitutive (e.g. damage) and contact models designed at the meso and micro scales
O3: Ensure real-time results through model order reduction coupled with the multi-scale fracture tools of O2
O4: Control solution accuracy and validate against a range of biomechanics problems, including real-life brain surgery interventions, with the data available at our collaborators’

"Surgeons are trained as apprentices. Some conditions are rarely encountered and surgeons will only be trained in the specific skills associated with a given situation if they come across it. At the end of their residency, it is hoped that they will have faced sufficiently many cases to be competent. This can be dangerous to the patients.
If we were able to reproduce faithfully, in a virtual environment, the audio, visual and haptic experience of a surgeon as they prod, pull and incise tissue, then, surgeons would not have to train on cadavers, phantoms, or on the patients themselves.
Only a few researchers in the Computational Mechanics community have attacked the mechanical problems related to surgical simulation, so that mechanical faithfulness is not on par with audiovisual. This lack of fidelity in the reproduction of surgical acts such as cutting may explain why most surgeons who tested existing simulators report that the ""sensation"" fed back to them remains unrealistic. To date, the proposers are not aware of Computational Mechanics solutions addressing, at the same time, geometrical faithfulness, material realism, evolving cuts and quality control of the solution.
The measurable objectives for this research are as follows:
O1:Significantly alleviate the mesh generation and regeneration burden to represent organs’ geometries, underlying tissue microstructure and cuts with sufficient accuracy but minimal user intervention
O2:Move away from simplistic coarse-scale material models by deducing tissue rupture at the organ level from constitutive (e.g. damage) and contact models designed at the meso and micro scales
O3:Ensure real-time results through model order reduction coupled with the multi-scale fracture tools of O2
O4:Control solution accuracy and validate against a range of biomechanics problems including real-life brain surgery interventions with the available at our collaborators’"

Project acronym: TUNE

Summary: Software-intensive systems pervade modern society and industry. These systems often play critical roles from an economic, safety or security standpoint, thus making their dependability indispensable. Software Verification and Validation (V&V) is core to ensuring software dependability. The most prevalent V&V technique is testing, that is, the automated, systematic, and controlled execution of a system to detect faults or to show compliance with requirements. Increasingly, we are faced with systems that are untestable, meaning that traditional testing methods are highly expensive, time-consuming or infeasible to apply due to factors such as the systems’ continuous interactions with the environment and the deep intertwining of software with hardware.
TUNE will enable testing of untestable systems by revolutionising how we think about test automation. Our key idea is to frame testing on models rather than operational systems. We refer to such testing as model testing. The models that underlie model testing are executable representations of the relevant aspects of a system and its environment, alongside the risks of system failures. Such models inevitably have uncertainties due to complex, dynamic environment behaviours and the unknowns about the system. This necessitates that model testing be uncertainty-aware.
We propose to develop scalable, practical and uncertainty-aware techniques for test automation, leveraging our expertise on model-driven engineering and automated testing. Our solutions will synergistically combine metaheuristic search with system and risk models to drive the search for critical faults that entail the most risk. TUNE is the first initiative with the specific goal of raising the level of abstraction of testing from operational systems to models. The project will bring early and cost-effective automation to the testing of many critical systems that defy existing automation techniques, thus significantly improving the dependability of such systems.
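The metaheuristic-search idea above can be illustrated with a minimal (1+1) evolutionary search that mutates candidate test scenarios and keeps any mutant that pushes a simulated model closer to violating a requirement. The "braking controller" model, its parameters, and the requirement threshold are all hypothetical, and this is a sketch of search-based testing in general, not TUNE's technique:

```python
import random

def search_for_critical_scenario(simulate, fitness, n_iters=2000, seed=1):
    """(1+1) evolutionary search: mutate a scenario, keep the mutant
    whenever it brings the model under test closer to a failure."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(3)]   # scenario parameters
    best_fit = fitness(simulate(best))
    for _ in range(n_iters):
        cand = [x + rng.gauss(0, 0.2) for x in best]   # small random mutation
        f = fitness(simulate(cand))
        if f > best_fit:                               # keep riskier scenarios
            best, best_fit = cand, f
    return best, best_fit

# Hypothetical model under test: a toy "braking controller" whose stopping
# distance grows with speed and road slipperiness; requirement: distance < 5.
def simulate(params):
    speed, slip, load = params
    return speed**2 * (1 + abs(slip)) + 0.5 * abs(load)  # stopping distance

fitness = lambda distance: distance   # risk = how close to / past the limit
scenario, risk = search_for_critical_scenario(simulate, fitness)
assert risk > 5.0   # the search uncovers a requirement-violating scenario
```

Because the search runs against a model rather than the operational system, each fitness evaluation is cheap, which is exactly what makes this loop practical for otherwise untestable systems.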
