TechTalks from event: IEEE FOCS 2010

51st Annual IEEE Symposium on Foundations of Computer Science

  • Pure and Bayes-Nash Price of Anarchy for Generalized Second Price Auction Authors: Renato Paes Leme and Eva Tardos
    The Generalized Second Price (GSP) auction, also known as the AdWords auction, and its variants have been the main mechanisms used by search companies to auction positions for sponsored search links. In this paper we study the social welfare of the Nash equilibria of this game. It is known that socially optimal Nash equilibria exist (i.e., that the Price of Stability for this game is 1). This paper is the first to prove bounds on the Price of Anarchy. Our main result shows that under some mild assumptions the Price of Anarchy is small. For pure Nash equilibria we bound the Price of Anarchy by 1.618, assuming all bidders play undominated strategies. For mixed Nash equilibria we prove a bound of 4 under the same assumption. We also extend the result to the Bayesian setting, where bidders' valuations are random, and prove a bound of 8 for this case. Our proof exhibits a combinatorial structure of Nash equilibria and uses this structure to bound the Price of Anarchy. While establishing the structure is simple in the case of pure and mixed Nash equilibria, the extension to the Bayesian setting requires novel combinatorial techniques that may be of independent interest.
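
    As an illustration of the game being analyzed, here is a minimal sketch of the GSP allocation and payment rule in the standard slot-auction model (one bid per bidder, slots with decreasing click-through rates); the simplified model and function names are ours, not the paper's:

```python
# Toy sketch of Generalized Second Price (GSP) allocation and payments:
# the i-th highest bidder wins slot i and pays the (i+1)-th highest bid
# per click. Quality scores and ties are ignored for simplicity.

def gsp_outcome(bids, ctrs):
    """bids: {bidder: bid}; ctrs: slot click-through rates, descending.
    Returns a list of (bidder, slot_ctr, price_per_click) per filled slot."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    outcome = []
    for i, ctr in enumerate(ctrs):
        if i >= len(ranked):
            break  # fewer bidders than slots
        bidder, _ = ranked[i]
        price = ranked[i + 1][1] if i + 1 < len(ranked) else 0.0
        outcome.append((bidder, ctr, price))
    return outcome
```

    Bidding dynamics on top of this rule (best responses, equilibrium selection) are where the Price of Anarchy analysis comes in.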
  • Frugal and Truthful Auctions for Vertex Covers, Flows, and Cuts / Frugal Mechanism Design via Spectral Techniques Authors: David Kempe and Mahyar Salek and Cristopher Moore / Ning Chen and Edith Elkind and Nick Gravin and Fedor Petrov
    We study truthful mechanisms for hiring a team of agents in three classes of set systems: Vertex Cover auctions, k-flow auctions, and cut auctions. For Vertex Cover auctions, the vertices are owned by selfish and rational agents, and the auctioneer wants to purchase a vertex cover from them. For k-flow auctions, the edges are owned by the agents, and the auctioneer wants to purchase k edge-disjoint s-t paths, for given s and t. In the same setting, for cut auctions, the auctioneer wants to purchase an s-t cut. Only the agents know their costs, and the auctioneer needs to select a feasible set and payments based on bids made by the agents.

    We present constant-competitive truthful mechanisms for all three set systems. That is, the maximum overpayment of the mechanism is within a constant factor of the maximum overpayment of any truthful mechanism, for every set system in the class. The mechanism for Vertex Cover is based on scaling each bid by a multiplier derived from the dominant eigenvector of a certain matrix. The mechanism for k-flows prunes the graph to be minimally (k+1)-connected, and then applies the Vertex Cover mechanism. Similarly, the mechanism for cuts contracts the graph until all s-t paths have length exactly 2, and then applies the Vertex Cover mechanism.
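
    A minimal sketch of the spectral ingredient: power iteration recovers the dominant eigenvector of a nonnegative matrix, and its entries then serve as bid multipliers. The construction of the matrix itself is paper-specific and omitted; the code below assumes a generic nonnegative matrix with a unique dominant eigenvalue:

```python
# Power iteration for the dominant eigenvector of a nonnegative matrix,
# normalized to sum to 1; entries are then used as per-bid multipliers.
# The matrix here is a placeholder, not the paper's construction.

def dominant_eigenvector(M, iters=1000, tol=1e-12):
    n = len(M)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(abs(x) for x in w)
        w = [x / norm for x in w]
        if max(abs(a - b) for a, b in zip(v, w)) < tol:
            break  # converged to the dominant eigendirection
        v = w
    return v

def scaled_bids(bids, M):
    mult = dominant_eigenvector(M)
    return [b * m for b, m in zip(bids, mult)]
```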

  • Budget Feasible Mechanisms Authors: Yaron Singer
    We study a novel class of mechanism design problems in which the outcomes are constrained by the payments. This basic class of mechanism design problems captures many common economic situations, and yet it has not been studied, to our knowledge, in the past. We focus on the case of procurement auctions in which sellers have private costs, and the auctioneer aims to maximize a utility function on subsets of items, under the constraint that the sum of the payments provided by the mechanism does not exceed a given budget. Standard mechanism design ideas such as the VCG mechanism and its variants are not applicable here. We show that, for general functions, the budget constraint can render mechanisms arbitrarily bad in terms of the utility of the buyer. However, our main result shows that for the important class of submodular functions, a bounded approximation ratio is achievable. Better approximation results are obtained for subclasses of the submodular functions. We explore the space of budget feasible mechanisms in other domains and give a characterization under more restricted conditions.
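
    The defining constraint can be made concrete with a small sketch (this is only the feasibility check, not the paper's mechanism): a procurement outcome is budget feasible if every selected seller is paid at least their cost and the payments sum to at most the budget:

```python
# Budget feasibility check for a procurement outcome (illustrative only):
# payments must cover each winner's cost (individual rationality) and their
# sum must not exceed the auctioneer's budget.

def budget_feasible(costs, payments, budget):
    """costs, payments: {seller: value} for the selected sellers."""
    individually_rational = all(payments[s] >= costs[s] for s in costs)
    within_budget = sum(payments.values()) <= budget
    return individually_rational and within_budget
```

    The paper's point is that standard payment rules such as VCG can violate the second condition arbitrarily badly, while for submodular utilities a mechanism satisfying both conditions with bounded approximation exists.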
  • Black-Box Randomized Reductions in Algorithmic Mechanism Design Authors: Shaddin Dughmi and Tim Roughgarden
    We give the first black-box reduction from arbitrary approximation algorithms to truthful approximation mechanisms for a non-trivial class of multi-parameter problems. Specifically, we prove that every packing problem that admits an FPTAS also admits a truthful-in-expectation randomized mechanism that is an FPTAS. Our reduction makes novel use of smoothed analysis, by employing small perturbations as a tool in algorithmic mechanism design. To argue that our perturbation schemes are incentive-compatible, we develop a ``duality'' between linear perturbations of the objective function of an optimization problem and of its feasible set. Viewed from the dual perspective, our mechanisms are maximal-in-distributional-range and hence truthful in expectation.
  • Boosting and Differential Privacy Authors: Cynthia Dwork and Guy Rothblum and Salil Vadhan
    Boosting is a general method for improving the accuracy of learning algorithms. We use boosting to construct privacy-preserving synopses of the input database. These are data structures that yield, for a given set $Q$ of queries over an input database, reasonably accurate estimates of the responses to every query in $Q$. Given a base synopsis generator that takes a distribution on $Q$ and produces a ``weak'' synopsis that yields ``good'' answers for a majority of the weight in $Q$, our Boosting for Queries algorithm obtains a synopsis that is good for all of $Q$. We ensure privacy for the rows of the database, but the boosting is performed on the queries.

    We provide an (inefficient) base synopsis generator for sets of arbitrary low-sensitivity queries (queries whose answers do not vary much under the addition or deletion of a single row). This yields the first privacy-preserving synopsis generator for arbitrary low-sensitivity queries.

    Boosting is an iterative method. In our Boosting for Queries algorithm, each iteration incurs a certain privacy loss. In analyzing the cumulative privacy loss over many iterations, we obtain a bound on the expected privacy loss from a single $\epsilon$-differentially private mechanism. Combining this with evolution of confidence arguments from the literature, we get a fresh perspective -- and stronger bounds -- on the expected cumulative privacy loss due to multiple mechanisms, each providing $\epsilon$-differential privacy or one of its relaxations, and each operating on (potentially) different, adaptively chosen, databases.

    Finally, we can also view the input database as a training set in a learning algorithm, where each row corresponds to an element in the training set. Given the power and prevalence of boosting, it is natural to ask whether boosting itself can be carried out while preserving privacy.
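
    A schematic, non-private skeleton of the boosting loop described above may help; the base synopsis generator, the answer interface, and the accuracy threshold are assumed placeholders, and the paper's privacy-preserving reweighting is omitted:

```python
import math

# Skeleton of Boosting for Queries (privacy machinery omitted): keep weights
# over the queries, call an assumed base synopsis generator on the current
# distribution, multiplicatively up-weight the queries the weak synopsis
# answers poorly, and aggregate the weak synopses by median.
# base_generator, answer, and the true_value attribute are illustrative
# interfaces, not the paper's.

def boost_for_queries(queries, base_generator, answer, rounds=10, eta=0.5, thresh=0.1):
    """queries: objects with a true_value attribute; answer(s, q): the answer
    synopsis s gives to query q. Returns a function answering each query via
    the median of the weak synopses' answers."""
    weights = [1.0] * len(queries)
    synopses = []
    for _ in range(rounds):
        total = sum(weights)
        dist = [w / total for w in weights]
        s = base_generator(queries, dist)  # weak synopsis: good on most weight
        synopses.append(s)
        for i, q in enumerate(queries):
            err = abs(answer(s, q) - q.true_value)
            # up-weight badly answered queries so the next round targets them
            weights[i] *= math.exp(eta if err > thresh else -eta)

    def final_answer(q):
        vals = sorted(answer(s, q) for s in synopses)
        return vals[len(vals) // 2]  # median aggregation

    return final_answer
```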

  • A Multiplicative Weights Mechanism for Interactive Privacy-Preserving Data Analysis Authors: Moritz Hardt and Guy Rothblum
    We consider statistical data analysis with online queries. In this setting a trusted curator maintains a database of sensitive information about individual participants, and releases privacy-preserving answers to online queries as they arrive. Our primary contribution is a new differentially private multiplicative weights mechanism for answering a large number of counting queries that arrive online and may be adaptively chosen. This mechanism yields two main results.

    First, it is the first efficient mechanism for answering large numbers of online queries with worst-case accuracy guarantees (accuracy on every input database). The error is optimal in its dependence on the number of participants and depends only logarithmically on the number of queries being answered. The running time is nearly linear in the size of the data universe. Previous mechanisms for answering many counting queries all run in super-cubic time, even in the (easier) offline setting where all queries are known in advance.

    Second, while continuing to guarantee worst-case privacy for any input database, we obtain exponential improvements in running time for a broad class of databases. When the input database is drawn from a smooth distribution that does not place too much weight on any single data item, accuracy remains as above and the running time becomes poly-logarithmic in the data universe size. To the best of our knowledge, this is the first example of a poly-logarithmic time mechanism for answering large numbers of general queries (indeed, for worst-case accuracy guarantees, there are known hardness results). Our main technical contributions are a new application of multiplicative weights techniques to the differential privacy setting, a new privacy analysis for multiplicative weights algorithms, and a general technique for reducing data dimensionality for databases drawn from smooth distributions.
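
    The core multiplicative weights step can be sketched as follows: maintain a distribution over the data universe and, when a query's synthetic answer deviates from a (noisy) true answer, reweight the universe items in the query's support. The learning rate and the source of the noisy answer are illustrative assumptions, not the paper's calibrated parameters:

```python
import math

# One multiplicative weights update over the data universe (illustrative).
# noisy_true is assumed to come from a Laplace-noised true count; eta is an
# arbitrary learning rate, not the paper's calibrated choice.

def mw_answer(x, query):
    """x: distribution over universe items; query: 0/1 indicator vector."""
    return sum(xi * qi for xi, qi in zip(x, query))

def mw_update(x, query, noisy_true, eta=0.1):
    """If the synthetic answer is too low, multiplicatively boost the items
    in the query's support (and dampen them if too high), then renormalize."""
    sign = 1.0 if noisy_true > mw_answer(x, query) else -1.0
    x = [xi * math.exp(sign * eta * qi) for xi, qi in zip(x, query)]
    z = sum(x)
    return [xi / z for xi in x]
```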

  • Impossibility of Differentially Private Universally Optimal Mechanisms Authors: Hai Brenner and Kobbi Nissim
    The notion of a universally utility-maximizing privacy mechanism was recently introduced by Ghosh, Roughgarden, and Sundararajan [STOC 2009]. These are mechanisms that simultaneously guarantee optimal utility to a large class of information consumers, while preserving Differential Privacy [Dwork, McSherry, Nissim, and Smith, TCC 2006]. Ghosh et al. demonstrated, quite surprisingly, a case where such a universally-optimal differentially-private mechanism exists, namely when the information consumers are Bayesian. This result was recently extended by Gupte and Sundararajan [PODS 2010] to risk-averse consumers.

    Both positive results deal with mechanisms (approximately) computing a single count query (i.e., the number of individuals satisfying a specific property in a given population), and the starting point of our work is an attempt to extend these results to similar settings, such as sum queries with non-binary individual values, histograms, and two (or more) count queries. We show, however, that universally-optimal mechanisms do not exist for any of these queries, both for Bayesian and risk-averse consumers.

    For the Bayesian case, we go further and give a characterization of the functions that admit universally-optimal mechanisms, showing that a universally-optimal mechanism exists, essentially, only for a (single) count query. At the heart of our proof is a representation of a query function $f$ by its privacy constraint graph $G_f$, whose edges correspond to values resulting from applying $f$ to neighboring databases.
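
    For contrast, the positive case above (a single count query) is handled by the geometric mechanism of Ghosh et al., which perturbs the true count by two-sided geometric noise with parameter alpha = exp(-eps). A sampling sketch, using the standard fact that the difference of two i.i.d. geometric variables is two-sided geometric:

```python
import math
import random

# Geometric mechanism for a single count query: output the true count plus
# two-sided geometric noise, Pr[delta] proportional to alpha**abs(delta)
# with alpha = exp(-eps). Sampled as a difference of two geometrics.

def geometric_failures(alpha):
    # number of failures before the first success, success probability 1 - alpha
    u = 1.0 - random.random()  # u in (0, 1], avoids log(0)
    return int(math.floor(math.log(u) / math.log(alpha)))

def geometric_mechanism(true_count, eps):
    alpha = math.exp(-eps)
    noise = geometric_failures(alpha) - geometric_failures(alpha)
    return true_count + noise
```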

  • The Limits of Two-Party Differential Privacy Authors: Andrew McGregor and Ilya Mironov and Toniann Pitassi and Omer Reingold and Kunal Talwar and Salil Vadhan
    We study differential privacy in a distributed setting where two parties would like to perform analysis of their joint data while preserving privacy for both datasets. Our results imply almost tight lower bounds on the accuracy of such data analyses, both for specific natural functions (such as Hamming distance) and in general. Our bounds expose a sharp contrast between the two-party setting and the simpler client-server setting (where privacy guarantees are one-sided). In addition, those bounds demonstrate a dramatic gap between the accuracy that can be obtained by differentially private data analysis versus the accuracy obtainable when privacy is relaxed to a computational variant of differential privacy.

    The first proof technique we develop demonstrates a connection between differential privacy and deterministic extraction from Santha-Vazirani sources. A second connection we expose shows that the ability to approximate a function by a low-error differentially-private protocol is strongly related to the ability to approximate it by a low-communication protocol (the connection goes both ways).
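
    A minimal sketch of the kind of two-party protocol whose accuracy such lower bounds constrain: one party randomizes its bits via randomized response (eps-DP per bit) and the other debiases the observed mismatch count to estimate Hamming distance. The protocol and parameter choices here are illustrative, not from the paper:

```python
import math
import random

# Randomized response for two-party Hamming distance (illustrative): Alice
# flips each bit with probability p = 1 / (1 + e^eps), Bob counts mismatches
# against his own bits and solves E[mismatches] = p*(n - d) + (1 - p)*d for d.

def randomized_response(bits, eps):
    p = 1.0 / (1.0 + math.exp(eps))
    return [b ^ (random.random() < p) for b in bits]

def estimate_hamming(noisy_a, b, eps):
    p = 1.0 / (1.0 + math.exp(eps))
    n = len(b)
    mismatches = sum(x != y for x, y in zip(noisy_a, b))
    return (mismatches - p * n) / (1.0 - 2.0 * p)  # unbiased estimate of d
```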

  • Deciding first-order properties for sparse graphs Authors: Zdenek Dvorak and Daniel Kral and Robin Thomas
    We present a linear-time algorithm for deciding first-order logic (FOL) properties in classes of graphs with bounded expansion. Many natural classes of graphs have bounded expansion; for instance, graphs of bounded tree-width, all proper minor-closed classes of graphs, graphs of bounded degree, graphs with no subgraph isomorphic to a subdivision of a fixed graph, and graphs that can be drawn in a fixed surface in such a way that each edge crosses at most a constant number of other edges. We also develop an almost linear-time algorithm for deciding FOL properties in classes of graphs with locally bounded expansion; those include classes of graphs with locally bounded tree-width or locally excluding a minor.

    More generally, we design a dynamic data structure for graphs belonging to a class $\mathcal{G}$ of graphs of bounded expansion. After a linear-time initialization, the data structure allows us to test an FOL property in constant time, and it can be updated in constant time after the addition or deletion of an edge, provided the list of possible edges to be added is known in advance and their addition results in a graph in $\mathcal{G}$. In addition, we design a dynamic data structure for testing $\Sigma_1$-properties or the existence of short paths between prescribed vertices in such classes of graphs. All our results hold for relational structures.
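
    The $\Sigma_1$-properties mentioned above are purely existential first-order sentences. A naive polynomial-time check of one such property ("there exist three pairwise adjacent vertices", i.e. a triangle) is sketched below for contrast with the constant-time queries the data structure supports after linear preprocessing:

```python
from itertools import combinations

# Brute-force check of the existential first-order sentence
# "exists x, y, z: E(x,y) and E(x,z) and E(y,z)" (a triangle).
# Takes O(n^3) time; the paper's data structure answers such queries
# in constant time on bounded-expansion classes after linear setup.

def has_triangle(adj):
    """adj: {vertex: set of neighbours}, undirected."""
    for u, v, w in combinations(adj, 3):
        if v in adj[u] and w in adj[u] and w in adj[v]:
            return True
    return False
```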

  • Logspace Versions of the Theorems of Bodlaender and Courcelle Authors: Michael Elberfeld and Andreas Jakoby and Till Tantau
    Bodlaender's Theorem states that for every $k$ there is a linear-time algorithm that decides whether an input graph has tree width $k$ and, if so, computes a width-$k$ tree decomposition. Courcelle's Theorem builds on Bodlaender's Theorem and states that for every monadic second-order formula $\phi$ and for every $k$ there is a linear-time algorithm that decides whether a given logical structure $\mathcal{A}$ of tree width at most $k$ satisfies $\phi$. We prove that both theorems still hold when ``linear time'' is replaced by ``logarithmic space.'' The transfer of the powerful theoretical framework of monadic second-order logic and bounded tree width to logarithmic space allows us to settle a number of both old and recent open problems in the logspace world.