Summary

In most European countries, social insurance programs such as welfare, unemployment insurance and disability insurance are characterized by low reemployment rates. Governments therefore spend huge amounts of money on active labour market programs, which are intended to help individuals find work. Recent surveys indicate that programs aimed at intensifying job search behaviour are much more effective than schooling programs aimed at improving human capital. A second conclusion from these surveys is that, despite the scale of spending on these programs, evidence on their effectiveness is limited. This research proposal aims to develop an economic framework for evaluating the effectiveness of popular programs such as reemployment bonuses, fraud detection, workfare and job search monitoring. The main innovation is that I will combine economic theory with recently developed econometric techniques and detailed administrative data sets that have not been explored before. While most of the literature focuses only on short-term outcomes, the available data allow me to also consider the long-term effectiveness of programs. The key advantage of an economic model is that I can compare the effectiveness of the different programs and consider modifications and combinations of programs. Furthermore, using an economic model I can construct profiling measures to improve the targeting of programs to subsamples of the population. This is particularly relevant if the effectiveness of programs differs across individuals or depends on the moment at which the program is offered. Therefore, the results from this research will not only be of scientific interest, but will also be of great value to policymakers.

Max ERC Funding

550 000 €

Duration

Start date: 2008-07-01, End date: 2013-06-30

Project acronym

BayesianMarkets

Project

Bayesian markets for unverifiable truths

Researcher (PI)

Aurelien Baillon

Host Institution (HI)

ERASMUS UNIVERSITEIT ROTTERDAM

Call Details

Starting Grant (StG), SH1, ERC-2014-STG

Summary

Subjective data play an increasing role in modern economics. For instance, new welfare measurements are based on people’s subjective assessments of their happiness or their life satisfaction. A problem of such measurements is that people have no incentives to tell the truth. To solve this problem and make those measurements incentive compatible, I will introduce a new market institution, called Bayesian markets.
Imagine we ask people whether they are happy with their life. On Bayesian markets, they will trade an asset whose value is the proportion of people answering Yes. Only those answering Yes will have the right to buy the asset and those answering No the right to sell it. Bayesian updating implies that “Yes” agents predict a higher value of the asset than “No” agents do and, consequently, “Yes” agents want to buy it while “No” agents want to sell it. I will show that truth-telling is then the optimal strategy.
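To make the updating step concrete, the following minimal sketch (an illustration under an assumed common Beta prior, not the proposal's formal model) works out the intuition: an agent's own answer is one informative draw about the population proportion p of "Yes" answers, so the posterior mean of p, and hence the expected settlement value of the asset, is higher for a "Yes" agent than for a "No" agent.

    # Minimal illustration, assuming a common Beta(a, b) prior over the proportion p
    # of "Yes" answers; the agent's own answer is treated as one Bernoulli(p) draw,
    # and the asset settles at (approximately) p in a large, truthful population.
    def posterior_mean_of_p(a: float, b: float, own_answer_is_yes: bool) -> float:
        """Posterior mean of p after conditioning on the agent's own answer."""
        return (a + 1) / (a + b + 1) if own_answer_is_yes else a / (a + b + 1)

    a, b = 2.0, 2.0                                  # hypothetical prior parameters
    value_if_yes = posterior_mean_of_p(a, b, True)   # expected asset value for a "Yes" agent
    value_if_no = posterior_mean_of_p(a, b, False)   # expected asset value for a "No" agent
    assert value_if_yes > value_if_no                # so "Yes" agents buy and "No" agents sell
    print(value_if_yes, value_if_no)                 # 0.6 vs 0.4 for this prior

For any proper Beta prior the inequality holds, which is the sense in which Bayesian updating drives "Yes" agents to the buy side and "No" agents to the sell side of the market.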
Bayesian markets reward truth-telling the same way as prediction markets (betting markets) reward people for reporting their true subjective probabilities about observable events. Yet, unlike prediction markets, they do not require events to be objectively observable. Bayesian markets apply to any type of unverifiable truths, from one’s own happiness to beliefs about events that will never be observed.
The present research program will first establish the theoretical foundations of Bayesian markets. It will then develop the proper methodology to implement them. Finally, it will disseminate the use of Bayesian markets via applications.
The first application will demonstrate how degrees of expertise can be measured and will apply it to risks related to climate change and nuclear power plants. It will contribute to the political debate by shedding new light on what true experts think about these risks. The second application will provide the first incentivized measures of life satisfaction and happiness.

Max ERC Funding

1 500 000 €

Duration

Start date: 2016-01-01, End date: 2020-12-31

Project acronym

CRITIQUEUE

Project

Critical queues and reflected stochastic processes

Researcher (PI)

Johannes S.H. Van Leeuwaarden

Host Institution (HI)

TECHNISCHE UNIVERSITEIT EINDHOVEN

Call Details

Starting Grant (StG), PE1, ERC-2010-StG_20091028

Summary

Our primary motivation stems from queueing theory, the branch of applied probability that deals with congestion phenomena. Congestion levels are typically nonnegative, which is why reflected stochastic processes arise naturally in queueing theory. Other applications of reflected stochastic processes are in the fields of branching processes and random graphs.
We are particularly interested in critically-loaded queueing systems (close to 100% utilization), also referred to as queues in heavy traffic. Heavy-traffic analysis typically reduces complicated queueing processes to much simpler (reflected) limit processes or scaling limits. This makes the analysis of complex systems tractable, and from a mathematical point of view, these results are appealing since they can be made rigorous. Within the large body of literature on heavy-traffic theory and critical stochastic processes, we launch two new research lines:
(i) Time-dependent analysis through scaling limits.
(ii) Dimensioning stochastic systems via refined scaling limits and optimization.
Both research lines involve mathematical techniques that combine stochastic theory with asymptotic theory, complex analysis, functional analysis, and modern probabilistic methods. The project will provide a platform enabling collaborations between researchers in pure and applied probability and researchers in the performance analysis of queueing systems. This will particularly be the case at TU/e, the host institution, and at the affiliated institution EURANDOM.
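As a textbook illustration of the kind of scaling limit referred to above (a classical result, not one of the new research lines), consider a sequence of M/M/1 queues with service rate \mu and traffic intensities \rho_n \to 1 such that \sqrt{n}\,(1-\rho_n) \to \beta > 0. The diffusion-scaled queue-length process then converges weakly to a reflected Brownian motion:

    \[
    \frac{Q_n(nt)}{\sqrt{n}} \;\Rightarrow\; R(t) = X(t) - \inf_{0 \le s \le t} X(s), \qquad t \ge 0,
    \]

where X is a Brownian motion with drift -\beta\mu and variance parameter 2\mu started at X(0)=0. Research line (i) concerns time-dependent refinements of limits of exactly this type.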

Max ERC Funding

970 800 €

Duration

Start date: 2010-08-01, End date: 2016-07-31

Project acronym

LIE ANALYSIS

Project

Lie Group Analysis for Medical Image Processing

Researcher (PI)

Remco Duits

Host Institution (HI)

TECHNISCHE UNIVERSITEIT EINDHOVEN

Call Details

Starting Grant (StG), PE1, ERC-2013-StG

Summary

The aim of this project is to substantially improve computer algorithms for image analysis in medical imaging. Currently available techniques often require significant application-specific tuning and have a limited application scope. This is mostly due to the use of non-generic feature spaces that involve many physical dimensions and lack a mathematical foundation.
Instead, we derive inspiration from the superior generic pattern recognition capabilities of the human brain and propose a novel operator design aiming at better results and wider applicability.
This novel operator design combines (partial and ordinary) differential equations on non-compact Lie groups (induced by stochastic processes and sub-Riemannian geometric control) with wavelet transforms. Many mathematical challenges arise in the analysis and (numerical) solutions of these operators.
The research departs from previously developed insights of the PI on 'invertible orientation scores', which can be regarded as a specific instance in a general Lie group theoretical framework. Within this general framework one obtains a comprehensive invertible score defined on a higher dimensional Lie group beyond position space. The key challenge is to appropriately exploit these scores, their survey of multiple features per position, their underlying group structure, and their invertibility. We will tackle this via left-invariant evolutions and left-invariant sub-Riemannian optimal control within the score.
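For concreteness, in the 2D case the orientation score referred to here lifts an image f in L_2(\mathbb{R}^2) to a function on the roto-translation group SE(2) via correlation with rotated copies of an anisotropic wavelet \psi (a minimal sketch of the standard construction; the admissibility condition on \psi that guarantees invertibility is omitted):

    \[
    U_f(\mathbf{x}, \theta) \;=\; \int_{\mathbb{R}^2} \overline{\psi\!\left(R_\theta^{-1}(\mathbf{y} - \mathbf{x})\right)}\, f(\mathbf{y})\, d\mathbf{y},
    \qquad (\mathbf{x}, \theta) \in \mathbb{R}^2 \rtimes SO(2) \equiv SE(2),
    \]

where R_\theta denotes rotation over angle \theta. The left-invariant evolutions and sub-Riemannian optimal control mentioned above then act on the score U_f rather than on the image f itself.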
The orientation score approach will be systematically extended towards multi-scale-and-orientation, multi-velocity and multi-frequency encoding and processing, widening the application scope. Moreover, improvements in contextual enhancement via invertible scores and improvements in optimal curve extractions in the Lie group domain of the score will be pursued.
We will develop and apply the resulting algorithms to a wide range of medical imaging challenges in neurological, retinal and cardiac applications.

Max ERC Funding

1 267 550 €

Duration

Start date: 2014-01-01, End date: 2018-12-31

Project acronym

New-Poetry

Project

New Advances through the boundaries of Poisson Geometry

Researcher (PI)

Marius Crainic

Host Institution (HI)

UNIVERSITEIT UTRECHT

Call Details

Starting Grant (StG), PE1, ERC-2011-StG_20101014

Summary

This project proposes new research directions that originate in the field of Poisson Geometry and which reach out towards other fields of Differential Geometry and Topology. It can also be seen as the development of a new field, Poisson Topology, the birth of which is clearly predicted by the recent results of the PI on stability of symplectic leaves.
Aims:
1. solving some of the most fundamental open problems in Poisson Geometry.
2. breaking the current boundaries of Poisson Geometry and bringing it to the forefront of the interplay between other fields in geometry (Foliation Theory, Symplectic Geometry, etc.).
Methods/tools:
1. build on the breakthrough results of the PI (and his collaborators), such as the one on the integrability of Lie algebroids or the geometric approach to the Conn-Weinstein theorem.
2. introduce new tools into Poisson Geometry, such as Nonlinear Functional Analysis or the fundamental ideas of Cartan that have not yet been exploited in Poisson Geometry (G-structures, Exterior Differential Systems, etc.).
New directions: I propose several interacting directions/subprojects, each one of independent interest. For example:
- the study of local invariants in Poisson Geometry (a wide-open problem), based on the PI's work and the use of ideas from G-structures.
- a new unified approach to stability theories, such as Mather's theory or Nijenhuis-Richardson's (apparently unrelated!). Poisson Geometry plays a unifying role here. We expect new fundamental results in Poisson Topology and related fields (including moduli spaces of flat connections).
- the study of global aspects of Poisson Geometry, e.g. the existence of codimension-one Poisson structures on spheres (the 5-dimensional sphere was settled only last year!). Recall that the analogous problem in Foliation Theory served as a driving force for that field (and will be used here). Global aspects will also take us into the world of Symplectic Topology, a high-risk/high-return journey that has never been taken before.
