Computation, Economics, and Game Theory

Archive for November, 2010

The National Science Foundation sent out a “Dear Colleague letter” some time ago requesting white papers on grand challenges in the social, behavioral, and economic sciences, and has now published those received. I find a non-trivial number of them to be of interest to the Algorithmic Game Theory/Economics community:

We revisit the problem of designing the profit-maximizing single-item auction, solved by Myerson in his seminal paper for the case in which bidder valuations are independently distributed. We focus on general joint distributions, seeking the optimal deterministic incentive compatible auction. We give a geometric characterization of the optimal auction through a duality theorem, resulting in an efficient algorithm for finding the optimal deterministic auction in the two-bidder case and an inapproximability result for three or more bidders.

We initiate the study of markets for private data, through the lens of differential privacy. Although the purchase and sale of private data has already begun on a large scale, a theory of privacy as a commodity is missing. In this paper, we propose to build such a theory. Specifically, we consider a setting in which a data analyst wishes to buy information from a population from which he can estimate some statistic. The analyst wishes to obtain an accurate estimate cheaply. On the other hand, the owners of the private data experience some cost for their loss of privacy, and must be compensated for this loss. Agents are selfish, and wish to maximize their profit, so our goal is to design truthful mechanisms. Our main result is that such auctions can naturally be viewed and optimally solved as variants of multi-unit procurement auctions. Based on this result, we derive auctions for two natural settings which are optimal up to small constant factors:
1. In the setting in which the data analyst has a fixed accuracy goal, we show that an application of the classic Vickrey auction achieves the analyst’s accuracy goal while minimizing his total payment.
2. In the setting in which the data analyst has a fixed budget, we give a mechanism which maximizes the accuracy of the resulting estimate while guaranteeing that the sum of payments does not exceed the analyst's budget.
In both cases, our comparison class is the set of envy-free mechanisms, which correspond to the natural class of fixed-price mechanisms in our setting.
In both of these results, we ignore the privacy cost due to possible correlations between an individual's private data and his valuation for privacy itself. We then show that generically, no individually rational mechanism can compensate individuals for the privacy loss incurred due to their reported valuations for privacy.
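The first result above invokes the classic Vickrey auction in a procurement (reverse) form: the analyst needs k data points, the k lowest-cost owners sell, and each is paid the (k+1)-st lowest reported cost, which makes truthful cost reporting a dominant strategy. A minimal sketch, with the function name and sample costs chosen purely for illustration:

```python
def procurement_vickrey(costs, k):
    """k-unit reverse Vickrey auction: select the k lowest-cost sellers
    and pay each of them the (k+1)-st lowest reported cost (the first
    losing bid), a threshold price independent of the winners' own bids."""
    assert k < len(costs), "need at least k+1 bidders for a threshold price"
    order = sorted(range(len(costs)), key=lambda i: costs[i])
    winners = order[:k]
    price = costs[order[k]]  # first losing bid sets the uniform payment
    return winners, price

winners, price = procurement_vickrey([3.0, 1.0, 4.0, 1.5, 9.0], k=3)
print(winners, price)  # winners are indices 1, 3, 0; each is paid 4.0
```

Because each winner's payment is set by the first losing bid rather than by her own report, no bidder can gain by misreporting her cost.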

A correlated equilibrium is defined by a linear program which, in the context in question, has exponentially many variables and polynomially many constraints. Papadimitriou and Roughgarden suggest running the Ellipsoid algorithm on the dual LP, which thus has polynomially many variables and exponentially many constraints. The key step is providing the separating-hyperplane oracle needed by the Ellipsoid algorithm, a feat accomplished by leveraging the existence proof of Hart and Schmeidler. The problem with using these separating hyperplanes in the algorithm is that they are not among the original inequalities, which causes delicate numerical difficulties when trying to go back from the dual solution to the primal one (in our context, actually from discovering the infeasibility of the dual to a primal solution). The new paper suggests a simple way to provide the Ellipsoid method with a violated inequality rather than with an arbitrary separating hyperplane. This makes going from the dual back to the primal trivial, since we simply get to eliminate all but a polynomially-large subset of the original variables, without any numerical issues. It also ensures that the final correlated equilibrium we get has polynomial-sized support.

The simple observation is that the Hart-Schmeidler separating hyperplane is a convex combination of inequalities, equivalently a probability distribution over them. Finding a single violated inequality is thus a de-randomization challenge, which in this case is solved using the classical method of conditional expectations.
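To make the objects above concrete: the primal variables are a distribution p over action profiles, and the inequalities say that no player can gain by deviating from a recommended action. For an explicitly-given small game, checking these inequalities (and exhibiting a violated one) is just a loop; the whole difficulty in the succinct-game setting is that the number of variables is exponential. A sketch on an assumed toy example, the game of Chicken with its classic correlated equilibrium mixing uniformly over three outcomes:

```python
# Payoffs indexed [row action][column action]; action 0 = Dare, 1 = Chicken.
A = [[0, 7], [2, 6]]          # row player's payoffs
B = [[0, 2], [7, 6]]          # column player's payoffs
p = [[0, 1/3], [1/3, 1/3]]    # candidate distribution over action pairs

def violated_inequality(A, B, p):
    """Return a violated incentive constraint of the correlated-equilibrium
    LP, or None if p satisfies them all.  Each constraint says: conditioned
    on being recommended an action, deviating does not help in expectation."""
    n, m = len(A), len(A[0])
    for i in range(n):            # row player: recommended i, deviation i2
        for i2 in range(n):
            gain = sum(p[i][j] * (A[i2][j] - A[i][j]) for j in range(m))
            if gain > 1e-9:
                return ("row", i, i2)
    for j in range(m):            # column player: recommended j, deviation j2
        for j2 in range(m):
            gain = sum(p[i][j] * (B[i][j2] - B[i][j]) for i in range(n))
            if gain > 1e-9:
                return ("col", j, j2)
    return None

print(violated_inequality(A, B, p))  # None: p is a correlated equilibrium
```

For instance, the point mass on (Chicken, Chicken) is not a correlated equilibrium, and the check returns the row player's profitable deviation to Dare.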

The University of Minnesota Duluth chancellor appointed a committee in 2008 to develop “a campus-wide statement in terms of the acceptability of technology and web-based research and scholarship within the promotion and tenure process.”

Newer forms of scholarship are emerging, often driven by technology tools. Scholarly web sites, blogs, software tools, electronic portfolios, video documentaries, and other newer forms may be considered as evidence of scholarly work at some institutions.

Some institutions accept that emerging forms of scholarship should count toward promotion and tenure. At the same time, few, if any, seem to have devised workable systems or metrics for documenting and evaluating such forms of scholarship.

Among its recommendations:

[…] departments […] may want to consider whether to include the broader areas of scholarship (research and creative work) as defined by Boyer (1990). In this book Boyer (1990) identifies four categories of scholarship […]:

the scholarship of discovery

the scholarship of integration

the scholarship of application

the scholarship of teaching

Faculty who wish to expand into newer forms of scholarship should plan in advance how to document the value of the work and recognize that some form of review by experts in the discipline must be included in the presentation of the work in a tenure or promotion document.