The Application Developers Alliance, which *surprise* represents app developers, is just a couple of years old and has roughly 40,000 individual members and nearly 180 member companies. The group's policy work focuses on data and patents, and Technocrat talked with its vice president for policy, law and government affairs, Tim Sparapani. He was previously Facebook's public policy director and senior legislative counsel at the American Civil Liberties Union.

Q: What are your top policy issues that you’re working on right now?

A: Well, it’ll be no surprise that most of them revolve around data.

And, you know, because our members are the experts in how to build new and novel technologies, using both businesses’ and the public’s data, there are a whole host of questions that arise from that.

So, they sort of span the globe of things. But mostly it’s about how we can use data wisely and well to benefit consumers and the public writ large.

…

Sometimes people sort of truncate this by calling it a privacy debate. Well, it's a lot more than that. You know, it's really a debate about whether data can be used to solve a series of societal problems, as our members believe it can be. And whether we can provide increasingly customized and personalized services and benefits to individuals, which give them tools and services that, before the app industry arose, used to cost them a whole lot of money, and now we can hopefully provide for free or nearly so. So it's also about consumer benefit.

(During the talk, Sparapani mentioned hypothetical versus real risks of data, and Technocrat asked him for more details.)

Q: So what are some of these hypothetical risks that you’ve heard that you think are not actually happening but people are worried about?

A: Well, having come from the ACLU earlier in my career, where I led the privacy effort, I feel I have some credibility on the subject matter.

There is a lot of talk of data being used against people. Or, to be more concrete, that the collection of data is itself a violation of people's privacy.

And I think we have come to espouse a belief that everybody is constantly casting off data and that the way to protect people is not to worry about data collection but rather to prevent misuse of data.

And it is almost doctrinal within past privacy thinking that somehow you should limit the amount of data collected and then afterwards minimize or mask it or, you know, limit access to it. But it turns out… those approaches don't prove to be terrifically beneficial to consumers. When you do that, you do two things. One, you can't guarantee the data's subsequent protection. Think of all the data breaches we've had. Right?

…

But what really matters is… if your health data is accidentally breached or intentionally breached… you know, does somebody use that health data to prevent you from getting a job or from being insured at a reasonable rate? That's the stuff that really matters. Right? And that's the stuff our membership is focused on — those data misuses, not the moment of collection.

…

The other thing you do when you focus on a collection or minimization regime for data privacy protection is prevent businesses or other researchers from discovering important insights, or novel products and services they can offer, from data a consumer gave for one purpose that could also be used for a second or a third purpose.