Abstract: We study collective decision-making in a voting game under the unanimity rule, with an ambiguous likelihood and ambiguity-averse voters who are MaxMin Expected Utility maximizers. We characterize the symmetric voting equilibria of this game, demonstrating that ambiguity helps reduce Type I errors: under ambiguity, voters are less likely to vote strategically against their information. Information aggregation improves as a result, and may even be restored to a fully informative equilibrium. We report evidence from a laboratory experiment supporting these predictions.

Date and time: 2 June, 11AM GMT (for example, noon in Paris, 11pm in AKL)

Abstract: Recent work by Bilò et al. [2019] concerns allocating graph vertices (treated as indivisible objects) so that each share forms a connected subgraph, and so that no agent envies another’s share “up to one outer good.” They obtain positive results that apply to arbitrarily many agents, but these are limited to Hamiltonian (a.k.a. traceable) graphs. What of the non-Hamiltonian case? We show that among topological classes of graphs, any non-Hamiltonian class has an upper bound on the number of agents for which fair shares are guaranteed. On the other hand, for the case of exactly 3 agents, positive results exist for some infinite, non-Hamiltonian graph classes. Our results, positive and negative, are obtained via transfer from related theorems in continuous fair division, but we must go beyond the standard model, which employs the unit interval [0,1] as the continuously divisible “cake.” Instead, we use several copies of [0,1] glued at their endpoints, to form the letter Y, or the figure 8, or the outline of a kiss... a “tangle.”
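To make the discrete fairness notion concrete, here is a minimal sketch (not the authors’ construction) of checking “envy-freeness up to one outer good” for connected bundles on a path graph. The valuations and allocations below are invented for illustration; an outer good of a contiguous bundle is taken to be an endpoint, since removing it keeps the bundle connected.

```python
def outer_goods(bundle):
    """Outer goods of a contiguous bundle on a path: its endpoints,
    whose removal leaves the rest of the bundle connected."""
    if not bundle:
        return []
    lo, hi = min(bundle), max(bundle)
    return [lo] if lo == hi else [lo, hi]

def is_ef1_outer(allocation, valuations):
    """Check envy-freeness up to one outer good for connected bundles
    on a path; valuations[i][v] is agent i's additive value for vertex v."""
    n = len(allocation)
    for i in range(n):
        own = sum(valuations[i][v] for v in allocation[i])
        for j in range(n):
            if i == j or not allocation[j]:
                continue
            other = sum(valuations[i][v] for v in allocation[j])
            # i must not envy j once some outer good of j's bundle is removed
            if all(own < other - valuations[i][g]
                   for g in outer_goods(allocation[j])):
                return False
    return True
```

For example, splitting a 4-vertex path as `[{0, 1}, {2, 3}]` satisfies the condition when agent 1 values vertex 0 highly (removing that outer good eliminates the envy), while giving one agent a single vertex of a uniformly valued path fails it.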

Abstract: Danilov, Karzanov and Koshevoy (2012) introduced, geometrically, an interesting operation of composition on Condorcet domains, and used it to resolve a long-standing problem of Fishburn about the maximal size of connected Condorcet domains. We give an algebraic definition of this operation and investigate its properties. We give a precise formula for the cardinality of the composition of two Condorcet domains and improve on the Danilov, Karzanov and Koshevoy result, showing that Fishburn’s alternating scheme does not always produce a largest connected Condorcet domain. I will also outline some exciting new developments in the search for the largest Condorcet domains.
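Fishburn’s alternating scheme mentioned in the abstract can be enumerated directly for small numbers of alternatives. The sketch below implements one of the two mirror-symmetric variants of the scheme (the other swaps the two “never” conditions and yields a domain of the same size); this is a standard construction, not code from the talk.

```python
from itertools import combinations, permutations

def alternating_scheme(n):
    """Fishburn's alternating scheme on {1, ..., n}: for each triple
    i < j < k, the middle element j is never ranked last among the
    three when j is even, and never ranked first when j is odd."""
    domain = []
    for order in permutations(range(1, n + 1)):
        pos = {a: p for p, a in enumerate(order)}  # smaller pos = ranked higher
        ok = True
        for i, j, k in combinations(range(1, n + 1), 3):
            if j % 2 == 0:
                # j must not be ranked below both i and k
                if pos[j] > pos[i] and pos[j] > pos[k]:
                    ok = False
                    break
            else:
                # j must not be ranked above both i and k
                if pos[j] < pos[i] and pos[j] < pos[k]:
                    ok = False
                    break
        if ok:
            domain.append(order)
    return domain
```

For n = 3 this yields a domain of 4 linear orders, and for n = 4 a domain of 9, matching the known sizes of the alternating scheme for small n.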

Abstract: We investigate the strategic behavior of firms in a Hotelling spatial setting. The innovation is to combine two important features that are ubiquitous in real markets: (i) the location space is two-dimensional, often with physical restrictions on where firms can locate; (ii) consumers with some probability shop at firms other than the nearest. We characterise convergent Nash equilibria (CNE), in which all firms cluster at one point, for several alternative markets. In the benchmark case of a square convex market, we provide a new direct geometric proof of a result by Cox (1987) that CNE can arise in a sufficiently central part of the market. The convexity of the square space limits its realism, however, so we proceed to investigate networks, which more faithfully represent a stylised city’s streets. In the case of a grid, we characterise CNE, which exhibit several new phenomena. CNE in more central locations tend to be easier to support, echoing the unrestricted square case. However, CNE on the interior of edges differ substantially from CNE at nodes and follow quite surprising patterns. Our results also highlight the role of positive masses of indifferent consumers, which arise naturally in a network setting. In most previous models, in contrast, such masses cannot exist or are assumed away as unrealistic.

Abstract: We consider a decision maker who first chooses an information structure, and then chooses an action after receiving a signal. The cost of information may be either material or cognitive and is unobserved. Thus, cost must be inferred from observable behavior. We assume that the choice of action is observed, but the choice of information is not. Due to the unobservability of the acquired private information, the choice of action appears random from an outside analyst’s point of view. We show that, given only stochastic choice from menus of actions, an analyst can identify the agent’s taste (risk attitude), prior belief, and information cost function. Identification of the cost function from behavior stands in contrast with the large literature on applications of the rational inattention model, where the functional form of the cost function is assumed known by the analyst (Sims, 2003). In addition, we discuss the behavioral implications of our model, which are weaker than some key properties of random expected utility models. In particular, the property of Monotonicity (the addition of a new action cannot increase the choice probabilities of the existing actions) is violated in our model because of the endogeneity, and hence menu-dependence, of private information. However, two axioms that jointly weaken Monotonicity are satisfied. Finally, we provide necessary and sufficient conditions for stochastic choice to be rationalized by our model.

Abstract: The study of Rational Secret Sharing initiated by Halpern and Teague (2004) regards the reconstruction of the secret in secret sharing as a game. It was shown that participants (parties) may refuse to reveal their shares, so reconstruction may fail. Moreover, refusing to reveal one’s share may be a dominant strategy of a party. In this paper we consider secret sharing as a sub-action or subgame of a larger action/game in which the secret opens the possibility of consuming a certain common good. We claim that the utilities of participants depend on the nature of this common good. In particular, Halpern and Teague (2004)’s scenario corresponds to a rivalrous and excludable common good. We consider the case when this common good is non-rivalrous and non-excludable, and show that it admits many natural Nash equilibria. We list several applications of secret sharing to demonstrate our claim and give corresponding scenarios. In such circumstances the secret sharing scheme facilitates a power-sharing agreement in the society. We also argue that non-reconstruction may be beneficial for this society and give several examples.
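For readers unfamiliar with the primitive being treated game-theoretically here, a minimal Shamir-style (t, n) threshold scheme, the standard textbook construction rather than anything specific to the talk, can be sketched as follows: the secret is the constant term of a random degree-(t-1) polynomial over a prime field, shares are evaluations of that polynomial, and any t shares recover the secret by Lagrange interpolation at zero.

```python
import random

P = 2_147_483_647  # a Mersenne prime used as the field modulus

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret % P] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        # modular inverse of den via Fermat's little theorem
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret
```

The “refusal” strategies studied in the rational secret sharing literature correspond to a party withholding its `(x, f(x))` pair, so that fewer than t shares are on the table and `reconstruct` cannot be run.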

Abstract: Unawareness is a form of bounded rationality where a person fails to conceive all feasible acts or consequences, or to perceive as feasible all conceivable act-consequence links. We study the implications of unawareness for tort law, where relevant examples include the discovery of a new product or technology (new act), of a new disease or injury (new consequence), or that a product can cause an injury (new link). We argue that negligence has an important advantage over strict liability in a world with unawareness: negligence, through the stipulation of due care standards, spreads awareness about the updated probability of harm.

Abstract: The credibility of scientific findings is of fundamental importance to enhance future research. One potential approach to collecting information about this credibility is to elicit scientists’ beliefs about the reproducibility of scientific claims. Four recent studies have used surveys and prediction markets to estimate beliefs about replication in systematic, large-scale replication projects, but the sample size in each study has been small. Here we pool the data from the four studies (n = 104) to assess the performance of surveys and prediction markets. This pooled dataset and subsequent analysis allow us to comment on the real-world success of prediction markets in eliciting and aggregating information.

Abstract: Prediction markets are popular tools to “crowd-source” distributed information and aggregate it into forecasts. Such forecasts can be very valuable for decision makers. Commercial companies, for instance, can benefit from accurate forecasts regarding the future demand for their products. Many decision-making problems, however, require conditional forecasts. To decide, for instance, between alternative marketing campaigns, a company needs to understand how each of the alternatives will affect sales. Finding mechanisms that properly incentivise participants to provide their information for such conditional forecasts is non-trivial, but can be done through so-called decision markets. I will kick off a mini-series on decision markets by describing how they work and giving an overview of our research activities on this topic.
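To make the incentive problem concrete, here is a toy sketch (invented for illustration, not taken from the talk) of one known fix for conditional forecasts: take every action with strictly positive probability, score only the forecast for the action actually taken with a proper scoring rule, and divide the payment by the selection probability of that action. The weighting then cancels in expectation, so truthful reporting maximises the forecaster’s expected payment.

```python
import math

def expected_payment(report, true_model, selection):
    """Expected payment to a forecaster in a toy decision market.

    report[a][o]     : reported probability of outcome o if action a is taken
    true_model[a][o] : forecaster's actual belief about outcome o under a
    selection[a]     : probability the decision maker takes action a
                       (must be strictly positive for every action)

    Payment when action a is taken and outcome o occurs is the log score
    log(report[a][o]) divided by selection[a]; the division cancels the
    selection weighting, leaving a sum of proper scores, one per action.
    """
    total = 0.0
    for a, delta in enumerate(selection):
        for o, q in enumerate(true_model[a]):
            if q > 0:
                total += delta * q * math.log(report[a][o]) / delta
    return total
```

Because each action’s term reduces to an expected log score under the forecaster’s own beliefs, misreporting any conditional distribution strictly lowers the expected payment (Gibbs’ inequality), regardless of which action ends up being chosen.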