Differential Privacy (DP) is one of the most successful approaches to preventing the disclosure of private information in statistical databases. It provides a formal privacy guarantee, ensuring that sensitive information about individuals cannot easily be inferred from the answers to aggregate queries. If two databases are adjacent, i.e. differ only in the data of one individual, then the query mechanism should not allow an observer to tell them apart by more than a certain factor. This also induces a bound on the distinguishability of two generic databases, determined by their distance in the Hamming graph of the adjacency relation.
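The standard guarantee is typically achieved by adding noise calibrated to the query's sensitivity. As a minimal illustration (not part of the talk's own material), the following sketch implements the classic Laplace mechanism for a counting query, whose sensitivity between adjacent databases is 1:

```python
import math
import random

def laplace_sample(scale):
    """Draw from a zero-mean Laplace distribution via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(database, predicate, epsilon):
    """epsilon-DP counting query: true count plus Laplace(1/epsilon) noise.
    A count has sensitivity 1: adjacent databases change it by at most 1,
    so noise with scale 1/epsilon yields the epsilon-DP guarantee."""
    true_count = sum(1 for row in database if predicate(row))
    return true_count + laplace_sample(1.0 / epsilon)
```

The noisy answer is unbiased, so averaging many independent runs recovers the true count, while any single release stays epsilon-differentially private.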

In this talk we lift the restriction on the adjacency relation and explore the implications of DP when the indistinguishability requirement depends on an arbitrary notion of distance. We show that in this way we can naturally express (protection against) privacy threats that cannot be represented with the standard notion, leading to new applications of the DP framework. We give intuitive characterizations of these threats in terms of Bayesian adversaries, generalizing two interpretations of (standard) differential privacy from the literature. We also revisit the well-known result from the differential privacy literature stating that universally optimal mechanisms exist only for counting queries, and show that, in our extended setting, universally optimal mechanisms exist for other queries too, notably sum, average, and percentile queries.
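In the generalized setting the guarantee reads Pr[K(x) = z] ≤ e^{ε·d(x,x')} · Pr[K(x') = z] for every pair of inputs x, x' and every output z, with d an arbitrary distance. As a hedged illustration (the mechanism below is the standard two-sided geometric mechanism, used here only as a concrete instance), one can check this inequality numerically for the distance d(x,x') = |x − x'| on integers:

```python
import math

def geometric_pmf(v, z, epsilon):
    """Pr[K(v) = z] for the two-sided geometric mechanism with alpha = e^{-epsilon}."""
    alpha = math.exp(-epsilon)
    return (1.0 - alpha) / (1.0 + alpha) * alpha ** abs(z - v)

def check_d_privacy(epsilon, inputs, outputs):
    """Verify Pr[K(x)=z] <= e^{epsilon * |x - x'|} * Pr[K(x')=z] for all pairs.
    The small slack absorbs floating-point error; the bound holds with
    equality when z lies on the far side of both inputs."""
    for x in inputs:
        for xp in inputs:
            bound = math.exp(epsilon * abs(x - xp))
            for z in outputs:
                if geometric_pmf(x, z, epsilon) > bound * geometric_pmf(xp, z, epsilon) * (1 + 1e-12):
                    return False
    return True
```

By the triangle inequality |z − x| − |z − x'| ≥ −|x − x'|, the probability ratio never exceeds e^{ε·|x−x'|}, which is exactly the d-privacy condition for this metric.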

Finally, we explore an application of our generalized DP to location privacy. In this case, the domain consists of the locations on a map and the distance is the geographical distance. This instance of the property, which we call geo-indistinguishability, is a formal notion of privacy for location-based systems that protects the user's exact location while allowing approximate information -- typically needed to obtain a certain desired service -- to be released.

We describe how to use our mechanism to enhance LBS applications with geo-indistinguishability guarantees without compromising the quality of the application results. It turns out that, among the known mechanisms that are independent of the prior, ours offers the best privacy guarantees. Finally, we present Location Guard, a tool based on our framework that has become quite popular, also among the general public.
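Geo-indistinguishability can be achieved with a planar Laplace mechanism: the reported point has density proportional to exp(−ε·d(z, x)) around the true location x, with d the Euclidean distance. The sketch below (an illustrative implementation in planar coordinates, not the talk's exact code) samples such noise by drawing a uniform angle and a radius from the Gamma(2, 1/ε) radial marginal, which is the sum of two exponentials:

```python
import math
import random

def planar_laplace(location, epsilon):
    """Report a noisy location with density proportional to exp(-epsilon * r),
    where r is the Euclidean distance from the true location.
    The radial marginal (after the polar Jacobian) is Gamma(2, 1/epsilon),
    sampled as the sum of two Exp(epsilon) draws; the angle is uniform."""
    x, y = location
    theta = random.uniform(0.0, 2.0 * math.pi)
    r = -(math.log(1.0 - random.random()) + math.log(1.0 - random.random())) / epsilon
    return (x + r * math.cos(theta), y + r * math.sin(theta))
```

The expected displacement is 2/ε, so ε directly trades privacy against the accuracy of the released location. A real deployment would work on latitude/longitude after projecting to planar coordinates.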