Privacy technologies: An annotated syllabus

Arvind Narayanan
Princeton University

Abstract. The teaching of graduate courses is symbiotic with research – it helps systematize knowledge, often creates new knowledge, and influences the thinking of the next generation of scholars, indirectly shaping the future of the discipline. I therefore believe that this forum is well-suited to discussing the teaching of privacy technologies. As a starting point, I offer my annotated syllabus and other reflections on the graduate seminar I taught at Princeton during Fall 2012.

In developing the curriculum, I aimed to provide technical depth while integrating perspectives from economics, law, policy and other schools of thought on privacy. These worldviews, in my opinion, sometimes conflict with and often add much-needed nuance to the narrative on privacy technologies within computer science.

As a first-year faculty member, I'm keenly aware that my experience as an educator is relatively limited; my intent here is to provoke discussion rather than to be authoritative.

1 Introduction

During Fall 2012 I taught a graduate seminar course on privacy technologies at Princeton.¹ My conception of the term "privacy technologies" includes both privacy-enhancing and privacy-infringing technologies. Since this was a new course, I had freedom over both the content and the format.

While this was decidedly a computer science course, I aimed to integrate and reconcile the computer science literature with the often divergent views on privacy and privacy technologies in the fields of economics, law, policy, philosophy, etc. This mirrors my view that privacy research as a whole stands to benefit from greater interdisciplinarity.

In terms of the format, my main departure from a traditional graduate seminar was to include an online Wiki discussion component, and in fact to make this the centerpiece, and to require students to participate in the online discussion of the day's readings before coming to class. The in-class discussion would use the Wiki discussion
as a starting point.

The advantages of this approach are:
1. it gives the instructor a great degree of control in shaping the discussion of each paper,
2. the instructor can more closely monitor individual students' progress,
3. class discussion can focus on particularly tricky and/or contentious points, instead of rehashing the obvious.

It was a small class of eleven students, of which ten completed it. Rigorously evaluating the effectiveness of teaching is hard (student evaluations are of limited use without baselines or controls), so I do not report any scientific results here. I only offer subjective thoughts.

Section 2 is about refuting privacy myths. Section 3, the bulk of this document, is an annotated syllabus. I have made available the entire set of Wiki discussion prompts for the class.² I consider this integral to the syllabus. Section 4 presents the broad thematic questions that I had in mind, and my concluding thoughts are in Section 5.

¹ http://randomwalker.info/teaching/fall-2012-privacy-technologies/
² http://randomwalker.info/teaching/fall-2012-privacy-technologies/discussion-prompts.html (also as PDF: http://randomwalker.info/teaching/fall-2012-privacy-technologies/discussion-prompts.pdf)

2 Refuting privacy myths via "expectation failure"

In this section I will discuss some major misconceptions about privacy, how to refute them, and why it is important to do this right at the beginning of the course.

Privacy's primary pitfalls

Instructors are often confronted with breaking down faulty mental models that students bring into class before actual learning can happen. This is especially true of the topic at hand. Luckily, misconceptions about privacy are so pervasive in the media and among the general public that it wasn't too hard to identify the most common ones before the start of the course. And it didn't take much class discussion to confirm that my students weren't somehow exempt from these beliefs.

One cluster of myths is about the supposed lack of importance of privacy. 1. "There is no privacy in the digital age." This is
the most common and perhaps the most grotesquely fallacious of the misconceptions; more on this below. 2. "No one cares about privacy any more" (variant: young people don't care about privacy). 3. "If you haven't done anything wrong you have nothing to hide."

A second cluster of fallacious beliefs is very common among computer scientists and comes from the tendency to reduce everything to a black-and-white technical problem. In this view, privacy maps directly to access control, and cryptography is the main technical mechanism for achieving privacy. It's a view in which the world is full of adversaries and there is no room for obscurity or nontechnical ways of improving privacy.³

The first step in learning is to unlearn

Why is it important to spend time confronting faulty mental models? Why not simply teach the "right" ones? In my case, there was a particularly acute reason: to the extent that students believe that privacy is dead and that learning about privacy technologies is unimportant, they are not going to be invested in the class, which would be really bad. But even in the case of misconceptions that don't lead to students doubting the fundamental premise of the class, there is a surprising reason why unlearning is important.

A famous experiment in the '80s demonstrated what we now know about the ineffectiveness of the "information transmission" model of teaching [5]. The researchers interviewed students who had taken any of four introductory physics courses, and determined that they hadn't actually learned what had been taught, such as Newton's laws of motion; instead they had just learned to pass the tests. When the researchers sat down with students to find out why, here's what they found:

"What they heard astonished them: many of the students still refused to give up their mistaken ideas about motion. Instead, they argued that the experiment they had just witnessed did not exactly apply to the law of motion in question; it was a special case, or it didn't quite fit the mistaken theory or law that they held as true."

A
special case! Ha. What's going on here? Well, learning new facts is easy. On the other hand, updating mental models is so cognitively expensive that we go to absurd lengths to avoid doing so. The societal-scale analog of this extreme reluctance is well illustrated by the history of science – we patched the Ptolemaic model of the Universe, with the Earth at the center, for over a millennium before we were forced to accept that the Copernican system fit observations better.

³ For refutations of these myths, see [1], [2], [3], [4].

The instructor's arsenal

The good news is that the instructor can utilize many effective strategies that fall under the umbrella of active learning. Ken Bain's excellent book (which the preceding text describing the experiment is from) lays out a pattern in which the instructor creates an expectation failure, a situation in which existing mental models of reality will lead to faulty expectations. One of the prerequisites for this to work, according to the book, is to get students to care.

Bain argues that expectation failure, done right, can be so powerful that students might need emotional support to cope. Fortunately, this wasn't necessary in my class, but I have no doubt of it based on my personal experiences. For instance, back when I was in high school, learning how the Internet actually worked and realizing that my intuitions about the network had to be discarded entirely was such a disturbing experience that I remember my feelings to this day.

Let's look at an example of expectation failure in my privacy class. To refute the "privacy is dying" myth, I found it useful to talk about Fifty Shades of Grey – specifically, why it succeeded even though publishers initially passed on it. One answer seems to be that since it was first self-published as an e-book, it allowed readers to be discreet and avoid the stigma associated with the genre. (But following its runaway success in that form, the stigma disappeared, and it was released in paper form and flew off the shelves.)

The relative privacy of
e-books from prying strangers is one of the many ways in which digital technology affords more privacy for specific activities. Confronting students with an observed phenomenon whose explanation involves a fact that seems starkly contrary to the popular narrative creates an expectation failure. Telling personal stories about how technology has either improved or eroded privacy, and eliciting such stories from students, gets them to care. Once this has been accomplished, it's productive to get into a nuanced discussion of how to reconcile the two views with each other, different meanings of privacy (e.g., tracking of reading habits), how the Internet has affected each, and how society is adjusting to the changing technological landscape.

3 Privacy technologies: An annotated syllabus

What should be taught in a class on privacy technologies? Before we answer that, let's take a step back and ask: how does one go about figuring out what should be taught in any class? I've seen two approaches.

The traditional, default, overwhelmingly common approach is to think of it in terms of "covering content" without much consideration of what students are getting out of it. The content that's deemed relevant is often determined by what the fashionable research areas happen to be, or historical accident, or some combination thereof.

A contrasting approach, promoted by authors like Bain, applies a laser focus to the skills that students will acquire and how they will apply them later in life. On teaching orientation day at Princeton, our instructor, who clearly subscribed to this approach, had each professor describe what students would do in the class they were teaching, then wrote down only the verbs from these descriptions. The point was that our thinking had to be centered on skills that students would take home.

I prefer a middle ground. It should be apparent from my description of the traditional approach above that I'm not a fan. On the other hand, I have to wonder what skills our teaching coach would have suggested for a course on
cosmology – avoiding falling into black holes? Alright, I'm exaggerating to make a point. The verbs in question are words like "synthesize" and "evaluate," so there would be no particular difficulty in applying them to cosmology. But my point is that in a cosmology course, I'm not sure the instructor should start from these verbs.

Sometimes we want students to be exposed to knowledge primarily because it is beautiful, and being able to perceive that beauty inspires us, instills in us a love of further learning, and I dare say satisfies a fundamental need. To me a lot of the crypto "magic" that goes into privacy technologies falls into that category (not that it doesn't have practical applications).

With that caveat, however, I agree with the emphasis on skills and life impact. I thought of my students primarily as developers of privacy technologies (and more generally, of technological systems that incorporate privacy considerations), but also as users and scholars of privacy technologies.

I organized the course into sections: a short introductory section followed by five sections that alternated in the level of math/technical depth. Every time we studied a technology, we also discussed its social/economic/political aspects. I had a great deal of discretion in guiding where the conversation around the papers went by giving the students questions/prompts on the class Wiki. Let us now jump in. The italicized text is from the course page; the rest is my annotation.

3.0 Intro

Goals of this section: Why are we here? Who cares about privacy? What might the future look like?

- Dan Solove. Why Privacy Matters Even if You Have 'Nothing to Hide' (Chronicle) [1]
- David Brin. The Transparent Society (WIRED, circa 1996 [6]; later expanded into a book [7])

In addition to helping flesh out the foundational assumptions of this course that I discussed in the previous section, pairing these opposing views with each other helped make the point that there are few absolutes in this class, that privacy scholars may disagree with each other, and that
the instructor doesn't necessarily agree with the viewpoints in the assigned reading, much less expects students to.

3.1 Cryptography: power and limitations

Goals. Travel back in time to the 80s and early 90s; understand the often-euphoric vision that many crypto pioneers and hobbyists had for the impact it would have. Understand how cryptographic building blocks were thought to be able to support this restructuring of society. Reason about why it didn't happen. Understand the motivations and mathematical underpinnings of the modern research on privacy-preserving computations. Experiment with various encryption tools; discover usability problems and other limitations of crypto.

- David Chaum. Security without Identification: Card Computers to Make Big Brother Obsolete (1985) [8]
- Steven Levy. Crypto Rebels (WIRED, 1993 [9]; later a 2001 book [10])
- Eric Hughes. A Cypherpunk's Manifesto (short essay, 1993) [11]

I think the Chaum paper is a phenomenal and underutilized resource for teaching. My goal was to really immerse students in an alternate reality where the legal underpinnings of commerce were replaced by cryptography, much as Chaum envisioned (and even going beyond that). I created a couple of e-commerce scenarios for Wiki discussion and had them reason about how various functions would be accomplished.

My own views on this topic are set forth in the paper (and talk) "What Happened to the Crypto Dream?" [12]. In general I aimed to shield students from my viewpoints, and saw my role as helping them discover (and be able to defend) their own. At least in this instance I succeeded. Some students took the position that the cypherpunk dream is just around the corner.

- The 'Garbled Circuit Protocol' (Yao's theorem on secure two-party computation) and its implications (lecture)

This is one of the topics that sadly suffers from a lack of good expository material, so I instead lectured on it.

- Alma Whitten and Doug Tygar. Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0 [13]
- Nikita Borisov, Ian
Goldberg, Eric Brewer. Off-the-Record Communication, or, Why Not To Use PGP [14]
- Thomas Ptacek. Javascript Cryptography Considered Harmful [15]

One of the exercises here was to install and use various crypto tools and rediscover the usability problems. The difficulties were even worse than I'd anticipated.

3.2 Data collection and data mining, economics of personal data, behavioral economics of privacy

Goals. Jump forward in time to the present day and immerse ourselves in the world of ubiquitous data collection and surveillance. Discover what kinds of data collection and data mining are going on, and why. Discuss how and why the conversation has shifted from Government surveillance to data collection by private companies in the last 20 years.

Theme: first-party data collection.
- New York Times. How Companies Learn Your Secrets [16]
- Andrew Odlyzko. Privacy, Economics, and Price Discrimination on the Internet [17]

Theme: third-party data collection.
- Julia Angwin. The Web's New Gold Mine: Your Secrets (first in the Wall Street Journal's What They Know series) [18]
- Jonathan R. Mayer and John C. Mitchell. Third-Party Web Tracking: Policy and Technology [19]

Theme: why companies act the way they do.
- Joseph Bonneau and Sören Preibusch. The Privacy Jungle: On the Market for Data Protection in Social Networks [20]
- Bruce Schneier. How Security Companies Sucker Us With Lemons (WIRED) [21]

Theme: why people act the way they do.
- Alessandro Acquisti and Jens Grossklags. What Can Behavioral Economics Teach Us About Privacy? [22]
- Alessandro Acquisti. Privacy in Electronic Commerce and the Economics of Immediate Gratification [23]

This section is rather self-explanatory. After the math-y flavor of the first section, this one has a good amount of economics, behavioral economics, and policy. One of the thought exercises was to project current trends into the future and imagine what ubiquitous tracking might lead to in five or ten years.

3.3 Anonymity and De-anonymization

Important note: communications anonymity (e.g., Tor) and data
anonymity/de-anonymization (e.g., identifying people in digital databases) are technically very different, but we will discuss them together because they raise some of the same ethical questions. Also, Bitcoin lies somewhere in between the two.

- Roger Dingledine, Nick Mathewson, Paul Syverson. Tor: The Second-Generation Onion Router [24]
- Satoshi Nakamoto. Bitcoin: A Peer-to-Peer Electronic Cash System [25]

Tor and Bitcoin (especially the latter) were the hardest but also the most rewarding parts of the class, both for them and for me. Together they took up four classes. Bitcoin is extremely challenging to teach because it is technically intricate, the ecosystem is rapidly changing, and a lot of the information is in random blog/forum posts.

In a way, I was betting on Bitcoin by deciding to teach it – if it had died with a whimper, their knowledge of it would be much less relevant. In general I think instructors should choose to make such bets more often; most curricula are very conservative. I'm glad I did.

- Nils Homer et al. Resolving Individuals Contributing Trace Amounts of DNA to Highly Complex Mixtures Using High-Density SNP Genotyping Microarrays [26]
- [Optional] Arvind Narayanan, Elaine Shi, Benjamin I. P. Rubinstein. Link Prediction by De-anonymization: How We Won the Kaggle Social Network Challenge [27]

It was a challenge to figure out which deanonymization paper to assign. I went with the DNA one because I wanted the students to see that deanonymization isn't a fact about data, but a fact about the world. Another thing I liked about this paper is that they'd have to extract its not-too-complex statistical methodology from the bioinformatics discussion in which it is embedded. This didn't go as well as I'd hoped.

I've co-authored a few deanonymization papers, but they're not very well written and/or are poorly suited for pedagogical purposes. The Kaggle paper is one exception, which I made optional.

- Paul Ohm. Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization [28]
- [Optional] Jane Yakowitz Bambauer. Tragedy of the Data Commons [29]

This is another pair of papers with opposing views. Since the latter paper is optional, knowing that most of the students wouldn't have read it, I used the Wiki prompts to raise many of the issues that its author raises.

3.4 Lightweight Privacy Technologies and New Approaches to Information Privacy

While cryptography is the mechanism of choice for cypherpunk privacy and anonymity tools like Tor, it is too heavy a weapon in other contexts like social networking. In the latter context, it's not so much users deploying privacy tools to protect themselves against all-powerful adversaries, but rather a service provider attempting to cater to a more nuanced understanding of privacy that users bring to the system. The goal of this section is to consider a diverse spectrum of ideas applicable to this latter scenario that have been proposed in recent years in the fields of CS, HCI, law, and more. The technologies here are "lightweight" in comparison to cryptographic tools like Tor.

- Scott Lederer, Jason Hong et al. Personal Privacy through Understanding and Action: Five Pitfalls for Designers [30]
- Franziska Roesner et al. User-Driven Access Control: Rethinking Permission Granting in Modern Operating Systems [31]
- Fred Stutzman and Woodrow Hartzog. Obscurity by Design: An Approach to Building Privacy into Social Media [32]
- Woodrow Hartzog and Fred Stutzman. The Case for Online Obscurity [2]
- Jerry Kang et al. Self-surveillance Privacy [33]
- [Optional] Ryan Calo. Against Notice Skepticism In Privacy (And Elsewhere) [34]
- Helen Nissenbaum. A Contextual Approach to Privacy Online [35]

3.5 Purely technological approaches revisited

This final section doesn't have a coherent theme (and I admitted as much in class). My goal with the first two papers was to contrast a privacy problem which seems amenable to a purely or primarily technological formulation and solution (statistical queries over databases of sensitive personal information) with one where such attempts
have been less successful (the decentralized, own-your-data approach to social networking and e-commerce).

- Differential Privacy (lecture)
- Cynthia Dwork. Differential Privacy [36]

Differential privacy is another topic that is sorely lacking in expository material, especially from the point of view of students who've never done crypto before. So this was again a lecture.

- Arvind Narayanan et al. A Critical Look at Decentralized Personal Data Architectures [37]
- John Perry Barlow. A Declaration of the Independence of Cyberspace (short essay, 1996) [38]
- James Grimmelmann. Sealand, HavenCo, and the Rule of Law [39]

These two essays aren't directly related to privacy. One of the recurring threads in this course is the debate between purely technological and legal or other approaches to privacy; the theme here is to generalize it to a context broader than privacy. The Barlow essay asserts the exceptionalism of Cyberspace as an unregulable medium, whereas the Grimmelmann paper provides a much more nuanced view of the relationship between the law and new technological frontiers.

I have made available the entire set of Wiki discussion prompts for the class.⁴ I consider this integral to the syllabus, for it shapes the discussion very significantly. I really hope other instructors and students find this useful as a teaching/study guide. For reference, each set of prompts (one set per class) took me about three hours to write on average.

4 Themes

Here are the "big questions" and themes of the course, including those I discussed as part of the course contents:

- Who cares about privacy? Does privacy matter even if we have "nothing to hide?"
- How do values about privacy change over time and across cultures?
- What would a world of perpetual observation look like, technologically and socially?
- What's the difference, and the relationship, between security and privacy?
- Are privacy technologies about cryptography and access control, or is there more to them?
- What might an alternative world with ubiquitous crypto have looked
like? Why are we not in that world?
- How does anonymity relate to privacy? How does crypto enable anonymity? What factors make it easier or harder to achieve?
- How can we classify privacy technologies in terms of the underlying techniques?
- How can we classify privacy technologies in terms of who they are intended for, and the incentives for usage?
- How can we go beyond the public/private dichotomy in privacy?
- What framework should we use for thinking about privacy technologies that have both good and bad uses?
- How much can be inferred or predicted about people's actions, behavior and preferences via machine learning? Are there limits to these inferences and predictions?
- What are the economic incentives that make companies act the way they do?
- What are the biases and heuristics that make people act the way they do?
- How does better design and HCI contribute to improved privacy?
- What does the story of privacy technologies tell us about the relationship between technology and society?
- How do we find a balance, in various contexts, between regulating by technology, by law, by social norms, and by other means? What are the strengths of each approach?
- Should academics who study privacy have a normative stance on it? Is it acceptable for this stance to be reflected in scholarly work?
- What are some incentives that academics from different disciplines have that affect the type of privacy scholarship that they do and the type of conclusions they reach?

⁴ http://randomwalker.info/teaching/fall-2012-privacy-technologies/discussion-prompts.html (also as PDF: http://randomwalker.info/teaching/fall-2012-privacy-technologies/discussion-prompts.pdf)

5 Discussion

Student feedback was predominantly positive. Essentially the only change that students wished for at the end of the course was more technical material. This was a surprise to me, since the evidence seemed to suggest that students were having difficulty with the more technical aspects – in the Wiki discussions that incorporated a mix of technical and less technical
questions, the latter generally had more participation. One model of student behavior that might explain this is that they follow a work-minimization strategy during the course but, looking back at the end, wish it had been structured so that they were forced to do more technical work.

Most of the students had only a passing familiarity with cryptography. I was pleasantly surprised by how far it is possible to go in teaching privacy technologies while building up the crypto background as necessary. While cryptography is obviously a key building block of many privacy technologies, perhaps it need not be a pedagogical prerequisite.

For several topics – Yao's garbled circuit protocol, Bitcoin, and differential privacy – I did not find good expository material online. Developing it would be valuable to the community, in my opinion. Perhaps this could even be a part of the activity of the class. Partly motivated by my teaching experience, there is now a nascent effort in the security group at Princeton to write a survey paper on Bitcoin.

Finally, I'm happy to report that one of the class projects has led to novel research findings [40].

Acknowledgement. I'm very grateful to Vitaly Shmatikov for feedback on the syllabus. Joseph Lorenzo Hall's privacy syllabus for his course at NYU was also very useful.⁵

⁵ http://josephhall.org/papers/NYU-MCC-1303-S2012_privacy_syllabus.pdf

References

1. D. J. Solove, "Why privacy matters even if you have 'nothing to hide'," Chronicle of Higher Education, 2011.
2. W. Hartzog and F. Stutzman, "The case for online obscurity."
3. A. Narayanan, "Adversarial thinking considered harmful (sometimes)."
4. ——, "The many ways in which the internet has given us more privacy."
5. K. Bain, What the Best College Teachers Do. Harvard University Press, 2004.
6. D. Brin, "The transparent society," Wired, 1996.
7. ——, The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom? Basic Books, 1999.
8. D. Chaum, "Security without identification: Transaction systems to make big brother
obsolete," Communications of the ACM, vol. 28, no. 10, pp. 1030–1044, 1985.
9. S. Levy, "Crypto rebels," Wired, 1993.
10. ——, Crypto: How the Code Rebels Beat the Government – Saving Privacy in the Digital Age. Penguin Books, 2001.
11. E. Hughes, "A cypherpunk's manifesto," 1993.
12. A. Narayanan, "What happened to the crypto dream?" IEEE Security & Privacy, vol. 11, no. 2, pp. 75–76, 2013.
13. A. Whitten and J. D. Tygar, "Why Johnny can't encrypt: A usability evaluation of PGP 5.0," in Proceedings of the 8th USENIX Security Symposium, 1999.
14. N. Borisov, I. Goldberg, and E. Brewer, "Off-the-record communication, or, why not to use PGP," in Proceedings of the 2004 ACM Workshop on Privacy in the Electronic Society. ACM, 2004, pp. 77–84.
15. T. Ptacek, "Javascript cryptography considered harmful," http://www.matasano.com/articles/javascript-cryptography/.
16. C. Duhigg, "How companies learn your secrets," The New York Times, 2012.
17. A. Odlyzko, "Privacy, economics, and price discrimination on the internet," in Proceedings of the 5th International Conference on Electronic Commerce. ACM, 2003, pp. 355–366.
18. J. Angwin, "The web's new gold mine: Your secrets," Wall Street Journal, 2010.
19. J. R. Mayer and J. C. Mitchell, "Third-party web tracking: Policy and technology," in Security and Privacy (SP), 2012 IEEE Symposium on. IEEE, 2012, pp. 413–427.
20. J. Bonneau and S. Preibusch, "The privacy jungle: On the market for data protection in social networks," in Economics of Information Security and Privacy. Springer, 2010, pp. 121–167.
21. B. Schneier, "How security companies sucker us with lemons," Wired, 2007.
22. A. Acquisti and J. Grossklags, "What can behavioral economics teach us about privacy?" in Digital Privacy, 2007, p. 329.
23. A. Acquisti, "Privacy in electronic commerce and the economics of immediate gratification," in Proceedings of the 5th ACM Conference on Electronic Commerce. ACM, 2004, pp. 21–29.
24. R. Dingledine, N. Mathewson, and P. Syverson, "Tor: The second-generation onion router," Tech. Rep., 2004.
25. S. Nakamoto, "Bitcoin: A peer-to-peer
electronic cash system," 2008.
26. N. Homer, S. Szelinger, M. Redman, D. Duggan, W. Tembe, J. Muehling, J. V. Pearson, D. A. Stephan, S. F. Nelson, and D. W. Craig, "Resolving individuals contributing trace amounts of DNA to highly complex mixtures using high-density SNP genotyping microarrays," PLoS Genetics, vol. 4, no. 8, p. e1000167, 2008.
27. A. Narayanan, E. Shi, and B. I. Rubinstein, "Link prediction by de-anonymization: How we won the Kaggle social network challenge," in Neural Networks (IJCNN), The 2011 International Joint Conference on. IEEE, 2011, pp. 1825–1834.
28. P. Ohm, "Broken promises of privacy: Responding to the surprising failure of anonymization," UCLA Law Review, vol. 57, p. 1701, 2010.
29. J. Yakowitz, "Tragedy of the data commons," 2011.
30. S. Lederer, J. I. Hong, A. K. Dey, and J. A. Landay, "Personal privacy through understanding and action: Five pitfalls for designers," Personal and Ubiquitous Computing, vol. 8, no. 6, pp. 440–454, 2004.
31. F. Roesner, T. Kohno, A. Moshchuk, B. Parno, H. J. Wang, and C. Cowan, "User-driven access control: Rethinking permission granting in modern operating systems," in Security and Privacy (SP), 2012 IEEE Symposium on. IEEE, 2012, pp. 224–238.
32. F. Stutzman and W. Hartzog, "Obscurity by design: An approach to building privacy into social media."
33. J. Kang, K. Shilton, D. Estrin, and J. Burke, "Self-surveillance privacy," Iowa Law Review, vol. 97, p. 809, 2011.
34. R. Calo, "Against notice skepticism in privacy (and elsewhere)," Notre Dame Law Review, vol. 87, 2012.
35. H. Nissenbaum, "A contextual approach to privacy online," Daedalus, vol. 140, no. 4, pp. 32–48, 2011.
36. C. Dwork, "Differential privacy," in Automata, Languages and Programming. Springer, 2006, pp. 1–12.
37. A. Narayanan, V. Toubiana, S. Barocas, H. Nissenbaum, and D. Boneh, "A critical look at decentralized personal data architectures," arXiv preprint arXiv:1202.4503, 2012.
38. J. P. Barlow, "A declaration of the independence of cyberspace," 1996.
39. J. Grimmelmann, "Sealand, HavenCo, and the rule of law," University of Illinois Law Review, p. 405, 2012.
40. C. Eubank, M. Melara, D. Perez-Botero, and A. Narayanan, "Shining
the floodlights on mobile web tracking – a privacy survey," in Web 2.0 Security and Privacy Workshop, 2013.