Sample of the questions; see the paper for the full set. These questions are meant as a starting point for inquiry; more questions can be added where appropriate.

Ex. Many people are willing to always share their location with family, but only during work hours with co-workers.

• A person’s name, which could be a legal name, or first or last name only;
• A person’s address, which could be a mailing address, email address, homepage, blog, or instant messenger address;
• A unique identifier, which could be a social security number or bank account number;
• Names or pseudonyms that cannot be easily traced, for example a disposable identifier used for anonymous HIV testing;
• A person’s appearance or behavior, for example web browsing habits, fashion style, or writing style;
• A person’s social categorization, including “gender, ethnicity, religion, age, education, region, sexual orientation, linguistic patterns, organizational memberships and classifications, health status, employment, leisure activities… credit risk, IQ, SAT scores, life style categorization for mass marketing” [29]; and
• A person’s relationship with others: who they are in love with, who they like, who they dislike, what services they use.

“I think this is disrespectful, demeaning and degrading”

“I guess my question is how does this help the NURSE?”

The second group of nurses was initially skeptical, but was won over because management did not abuse the system and because they eventually saw the value of such a system. One nurse wrote, “I admit, when we first started using it we all hated it for some of the same reasons cited above [in the message board] but I do think it is a timesaver! It is very frustrating when someone floats to our unit and doesn’t have a tracker…can’t find them for [doctor] calls, [patient] needs etc.”

“At first, we hated it for various reasons, but mostly we felt we couldn’t take a bathroom break without someone knowing where we were…[but now] requests for medications go right to the nurse and bedpans etc go to the techs first. If they are tied up, then we get a reminder page and can take care of the pts needs. I just love [the locator system].”

“I would be creeped if my friends found me. And they said I saw you here. It would just be weird.” “for professors, it would be weird”

• How does the unwanted disclosure take place? Is it an accident (for example, hitting the wrong button)? A misunderstanding (for example, the data sharer thinks they are doing one thing, but the system does another)? A malicious disclosure?
• How much choice, control, and awareness do data sharers have over their personal information? What kinds of control and feedback mechanisms do data sharers have to give them choice, control, and awareness? Are these mechanisms simple and understandable? What is the privacy policy, and how is it communicated to data sharers?
• What are the default settings? Are these defaults useful in preserving one’s privacy?
• In what cases is it easier, more important, or more cost-effective to prevent unwanted disclosures and abuses? Detect disclosures and abuses?
• Are there ways for data sharers to maintain plausible deniability?
• What mechanisms for recourse or recovery are there if there is an unwanted disclosure or an abuse of personal information?

4.
Privacy Risk Model Analogy
Security Threat Model
“[T]he first rule of security analysis is this: understand your threat model. Experience teaches that if you don’t have a clear threat model – a clear idea of what you are trying to prevent and what technical capabilities your adversaries have – then you won’t be able to think analytically about how to proceed. The threat model is the starting point of any security analysis.”
– Ed Felten

7.
Privacy Risk Analysis
Common Questions to Help Design Teams Identify Risks
Social and Organizational Context
– Who are the users?
– What kinds of personal info are shared?
– Relationships between sharers and observers?
– Value proposition for sharing?
– …

8.
Social and Organizational Context
Who are the users? Who shares info? Who sees it?
Different communities have different needs and norms
– An app appropriate for families might not be for work settings
Affects conditions and types of info willing to be shared
– Location information with spouse vs co-workers
– Real-time monitoring of one’s health
Start with most likely users
– Ex. Find Friends
– Likely sharers are people using mobile phones
– Likely observers are friends, family, co-workers

9.
Social and Organizational Context
What kinds of personal info are shared?
Different kinds of info have different risks and norms
– Current location vs. home phone number vs. hobbies
Some information already known between people
– Ex. Don’t need to protect identity with your friends and family
Different ways of protecting different kinds of info
– Ex. Can revoke access to location, but cannot for birthday or name

11.
Social and Organizational Context
Value proposition for sharing personal information?
What incentive do users have for sharing?
Quotes from nurses using locator badges
– “I think this is disrespectful, demeaning and degrading”
– “At first, we hated it for various reasons, but mostly we felt we couldn’t take a bathroom break without someone knowing where we were…[but now] requests for medications go right to the nurse and bedpans etc go to the techs first... I just love [the locator system].”
When those who share personal info do not benefit in
proportion to perceived risks, then the tech is likely to fail

12.
Privacy Risk Analysis
Common Questions to Help Design Teams Identify Risks
Social and Organizational Context
– Who are the users?
– What kinds of personal info are shared?
– Relationships between sharers and observers?
– Value proposition for sharing?
– …
Technology
– How is personal info collected?
– Push or pull?
– One-time or continuous?
– Granularity of info?
– …

13.
Technology
How is personal info collected?
Different technologies have different tradeoffs for privacy
Network-based approach
– Info captured and processed by external computers that users have no practical control over
– Ex. Locator badges, Video cameras
Client-based approach
– Info captured and processed on end-user’s device
– Ex. GPS, beacons
– Stronger privacy guarantees, all info starts with you first
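The network-based vs. client-based tradeoff can be sketched in code. This is a hypothetical illustration (class and method names are mine, not from the paper): in the network-based design the infrastructure logs users automatically, while in the client-based design the reading stays on the device until the user explicitly shares it.

```python
# Hypothetical sketch contrasting the two collection architectures.

class NetworkBasedLocator:
    """Network-based (e.g., locator badges, video cameras): external
    infrastructure captures and stores the data; users have no
    practical control over the server-side log."""
    def __init__(self):
        self.observations = []  # lives on infrastructure the user cannot control

    def sensor_reading(self, user, location):
        # The infrastructure records the user automatically.
        self.observations.append((user, location))


class Observer:
    """Someone who receives shared location info."""
    def __init__(self):
        self.seen = None

    def receive(self, location):
        self.seen = location


class ClientBasedLocator:
    """Client-based (e.g., GPS computed on the device): info is captured
    and processed locally; nothing leaves until the user shares it."""
    def __init__(self):
        self.current = None  # stays on the end-user's device

    def sensor_reading(self, location):
        self.current = location  # processed locally, not uploaded

    def share_with(self, observer):
        # Disclosure happens only on an explicit user action.
        observer.receive(self.current)
```

The structural point is where the data lives by default: in the client-based design, "all info starts with you first," so disclosure requires an affirmative act rather than a retroactive restriction.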

14.
Technology
Push or pull?
Push is when user sends info first
– Ex. you send your location info on E911 call
– Few people seem to have problems with push
Pull is when another person requests info first
– Ex. a friend requests your current location
– Design space much harder here
• Need to make people aware of requests
• Want to provide understandable level of control
• Don’t want to overwhelm
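The pull design space above can be sketched as a minimal, hypothetical request handler (names and policies are illustrative, not from the paper): a buddy list provides control, an invisible mode provides plausible deniability, and a request log records who asked so awareness notifications can be built on top of it.

```python
# Hypothetical sketch of a pull-style handler with control, plausible
# deniability, and awareness. All names and policies are illustrative.

class LocationSharer:
    def __init__(self, buddy_list):
        self.buddy_list = set(buddy_list)  # observers allowed to pull
        self.invisible = False             # user-controlled override
        self.request_log = []              # awareness: who asked, even if denied
        self.location = None

    def handle_pull(self, requester):
        """Respond to another person's request for my current location."""
        self.request_log.append(requester)  # recorded before any policy check
        if self.invisible:
            return None  # indistinguishable from being offline
        if requester not in self.buddy_list:
            return None  # control: default deny for strangers
        return self.location
```

Note that a denied request returns the same `None` as invisible mode, so an observer cannot tell refusal from unavailability; that design choice is what preserves plausible deniability.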

18.
Privacy Risk Management
Helps teams prioritize and manage risks
First step is to prioritize risks by estimating:
– Likelihood that unwanted disclosure occurs
– Damage that will happen on such a disclosure
– Cost of adequate privacy protection
Focus on high likelihood, high damage, low cost risks first
– Like heuristic eval, fix high severity and/or low cost
– Difficult to get exact numbers, more important is the process
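The prioritization step above can be sketched as a simple scoring pass. This is a hypothetical illustration (the risk names and 1-10 estimates below are made up): scoring each risk as likelihood × damage ÷ cost surfaces high-likelihood, high-damage, low-cost risks first, and as the slide notes, the process matters more than the exact numbers.

```python
# Hypothetical sketch of the prioritization step with rough 1-10
# estimates; the specific risks and numbers are illustrative only.

def prioritize(risks):
    """risks: list of (name, likelihood, damage, cost) tuples.
    Higher score = address first."""
    return sorted(risks, key=lambda r: (r[1] * r[2]) / r[3], reverse=True)

risks = [
    ("accidental over-sharing",   8, 5, 2),  # likely, cheap to mitigate
    ("malicious insider",         2, 9, 8),  # rare, expensive to mitigate
    ("co-worker over-monitoring", 4, 6, 3),
]
ranked = prioritize(risks)
```

With these estimates the cheap, likely risk ranks first and the rare, expensive one last, which is exactly the "high severity and/or low cost first" ordering from heuristic evaluation.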

19.
Privacy Risk Management
Helps teams prioritize and manage risks
Next step is to help manage those risks
How does the disclosure happen?
– Accident? Bad user interface? Poor conceptual model?
– Malicious? Inside job? Scammers?
What kinds of choice, control, and awareness are there?
– Opt-in? Opt-out?
– What mechanisms? Ex. Buddy list, Invisible mode
– What are the default settings?
Better to prevent or to detect abuses?
– “Bob has asked for your location five times in the past hour”
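The detection approach in the last bullet can be sketched as a sliding-window counter. This is a hypothetical implementation (the one-hour window and five-request threshold are illustrative choices, not from the paper): rather than preventing requests, it surfaces unusual activity to the data sharer.

```python
# Hypothetical sketch of detecting, rather than preventing, abuses:
# count each observer's requests in a sliding window and alert the
# data sharer when the count looks excessive. Window and threshold
# values are illustrative.
from collections import defaultdict

WINDOW = 3600   # one hour, in seconds
THRESHOLD = 5   # requests before the sharer is alerted

class RequestMonitor:
    def __init__(self):
        self.requests = defaultdict(list)  # observer -> request timestamps

    def record(self, observer, now):
        """Log a request; return an alert string if it looks excessive."""
        self.requests[observer].append(now)
        # keep only requests inside the sliding window
        recent = [t for t in self.requests[observer] if now - t <= WINDOW]
        self.requests[observer] = recent
        if len(recent) >= THRESHOLD:
            return (f"{observer} has asked for your location "
                    f"{len(recent)} times in the past hour")
        return None
```

A monitor like this is often cheaper than airtight prevention: it leaves normal use friction-free while still giving the sharer the awareness needed for recourse.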