A Look at the Future of Privacy Notices (If They Have a Future)

How bad is the situation for privacy notices? The National Science Foundation just used part of its largest grant program, a Frontier award of well over $1 million, to fund a team of researchers looking to fix them.

To be clear, “We try to look at society’s biggest challenges and the things that really matter,” said Lisa-Joy Zgorski, a spokesperson for the National Science Foundation, “things that affect lives and jobs, and we tackle these issues with the requests for proposals.”

The project in question, “Towards Effective Web Privacy Notice and Choice: A Multi-Disciplinary Perspective,” is led by researchers at Carnegie Mellon University and includes teams at both Fordham and Stanford. They hope to use advances in machine learning, crowdsourcing and graphic design to take the often-boilerplate privacy policies found on virtually every website nowadays and make them much more digestible and useful for the average web surfer.

“Nobody really understands these privacy policies,” said Nina Amla, one of the program managers at the National Science Foundation’s Computer and Information Science and Engineering Directorate. “They’re pages long, and the few people who do read them don’t come away with much information about how to make decisions about visiting that website or not.”

Of course, Amla is not the first to make those observations. Attendees at the IAPP’s Navigate event were treated to a presentation by Carnegie Mellon’s Jason Hong showing that the average web surfer would need to devote some 25 days a year to reading privacy notices in order to actually read the notice on every site they visit.

“Every time I teach privacy, I ask for a show of hands to see who has read a privacy policy,” said Norman Sadeh, the project’s lead investigator at Carnegie Mellon. He starts by asking who’s read a privacy notice in the last month. No hands go up. The last year? No hands go up. Ever? “Maybe we’ll get a few hands,” he laughed, “but it’s a very tiny minority in the room.”

“I think we all realize that very few people read these policies,” he said, “and even if you do read them, you can’t answer the most basic questions about them.”

While researchers like Lorrie Cranor, who’s on Sadeh’s team, have looked at asking websites to use something more akin to a nutritional label, “we’re seeing that website operators are not necessarily keen to do much more than what they’ve already been doing,” Sadeh said.

With a background in machine learning and natural-language research, Sadeh began to wonder a while back whether those technologies might be applied to privacy notices, which tend to share a lot of similar language and patterns, so as to automatically answer the questions about those notices that matter most to the consumers visiting the sites.

Essentially, he asked, “can we take these policies in their ugliness and extract something meaningful out of them?”

The end result, he said, might be a browser plugin that displays a very simple color, or a letter grade, when someone visits a website that’s been evaluated by the program. It’s unlikely, said Sadeh, that what he’s envisioning could be done in real-time, but the sweeps of the kind done by privacy commissioners, for example, could be made much more efficient.

To that end, he’s assembled a team from the three universities with backgrounds in areas like legal research, public policy and human-computer interaction, in addition to privacy.

Once some research is done on which privacy questions consumers really care about, and how best to gather an online crowd interested in being a source of information, the workflow might look something like this:

First, the text of the privacy notice is ingested by the software. It pulls out the portions of the policy it believes answer the five-to-seven questions most important to consumers and offers up what it believes are the answers. The crowd online confirms or corrects those answers, and a score for the site is generated. That score is recorded and added to the database. Finally, when someone next visits that site with the plugin installed, the score is displayed, and the consumer can decide whether to simply proceed or dive deeper into the answers to those important questions.
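The pipeline described above can be sketched in a few dozen lines. This is purely illustrative: the questions, keyword heuristics and scoring rule below are placeholders standing in for the project’s actual machine-learning extractors and crowdsourcing platform, none of which have been published in detail.

```python
# Illustrative sketch of the ingest -> extract -> crowd-review -> score ->
# plugin-lookup workflow. All question names and cues are hypothetical.

# A handful of consumer questions, each with naive keyword cues standing
# in for a trained machine-learning extractor.
QUESTIONS = {
    "shares_with_third_parties": ["third party", "third parties", "share"],
    "uses_tracking": ["cookie", "tracking", "beacon"],
    "allows_deletion": ["delete", "erasure"],
    "sells_data": ["sell", "sale of"],
    "retention_stated": ["retain", "retention"],
}

def extract_candidate_answers(policy_text):
    """Pull out the sentences that appear to answer each question."""
    sentences = [s.strip() for s in policy_text.split(".") if s.strip()]
    return {
        question: [s for s in sentences if any(c in s.lower() for c in cues)]
        for question, cues in QUESTIONS.items()  # empty list = policy is silent
    }

def crowd_review(candidates, corrections=None):
    """The online crowd confirms or corrects the machine's answers.
    Here that is modeled as a dict of overrides."""
    reviewed = dict(candidates)
    if corrections:
        reviewed.update(corrections)
    return reviewed

def score_site(reviewed):
    """A toy score: the percentage of questions the policy addresses at all."""
    answered = sum(1 for hits in reviewed.values() if hits)
    return round(100 * answered / len(reviewed))

SCORE_DB = {}  # scores recorded for later lookup

def evaluate(url, policy_text, corrections=None):
    reviewed = crowd_review(extract_candidate_answers(policy_text), corrections)
    SCORE_DB[url] = score_site(reviewed)
    return SCORE_DB[url]

def plugin_lookup(url):
    """What a browser plugin might display on a later visit."""
    score = SCORE_DB.get(url)
    if score is None:
        return "not yet evaluated"
    return "green" if score >= 80 else "yellow" if score >= 40 else "red"
```

A policy that mentions only third-party sharing and cookies would answer two of the five questions, score 40 and show up yellow; a policy silent on everything scores zero and shows red, which matches Sadeh’s point that a policy never answering the questions you care about is itself revealing.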

“Maybe we can find answers that matter to users,” said Sadeh, “though the answers may or may not be doable depending on what the policies do say. Some of them do a great job of never answering a question that you care about, and sometimes that’s very revealing. If they don’t make a statement about a valid question, then that’s an issue, and I can probably get a crowd to help me with pointing that out.”

At the end of the project’s three-and-a-half years, the hope, said Sadeh, is that “I can do this on a massive scale and we can start automating the sweeps … We might be able to see how policies evolve, or check how new regulations are being addressed, maybe even inform regulators and get them to impose various sanctions.

“We all realize that in many different domains,” he continued, “there’s been a rush to the bottom in terms of privacy practices, and the idea of self-regulation and that people would start competing on privacy policies, well, that was wishful thinking and that remains wishful thinking. But, if one day you can distill all this information so that it’s much easier for a user to digest, then maybe you find yourself with that actually happening.”
