Modeling and Reasoning Upon Facebook Privacy Settings

Mathieu d'Aquin and Keerthi Thomas

Knowledge Media Institute, The Open University, Milton Keynes, UK
{mathieu.daquin, keerthi.thomas}@open.ac.uk

Abstract. Understanding the way information is propagated and made visible on Facebook is a difficult task. The privacy settings and the rules that apply to individual items are reasonably straightforward. However, for the user to track all of the information that needs to be integrated, and the inferences that can be made on their posts, is complex, to the extent that it is almost impossible for any individual to achieve. In this demonstration, we investigate the use of knowledge modeling and reasoning techniques (including basic ontological representation, rules and epistemic logics) to make these inferences explicit to the user.

1 Introduction

The notion of social translucence (as defined in [4]) concerns the design of systems with a social process component, to achieve coherent behaviours from the user(s) through making such behaviours visible and understandable to them. This notion is especially relevant in relation to privacy, where the principles of visibility, awareness and accountability promoted by social translucence are used to enable a coherent and informed behaviour from the users with respect to the distribution and propagation of their personal information. This idea is well illustrated in the notion of "Privacy Mirrors", i.e., systems that integrate the necessary tools of "awareness and control to understand and shape the behaviour of the system" [6].

While these notions might appear to naturally apply to social networking systems such as Facebook¹, their privacy settings and the mechanisms for information sharing they implement are only deceptively simple: for an individual user to keep track of all the necessary elements to understand what information others might have access to, and what inferences they might derive from it, is actually too complex to be achieved. For example, while individual photos
have specific privacy scopes, the tagging, comments, likes and geographical information attached to them can make much more information about the user available to many more people than the user might intend, without the user being able to understand the full scope of the implications of such sharing and tagging.

In this demonstration, we show how a privacy mirror for Facebook can be implemented using knowledge modeling and reasoning techniques, to make explicit to the user some of the inferences that can be made out of information available about them on the social platform. We use basic ontology modeling, rules and a simplification of the basic concepts of epistemic logics.

¹ http://facebook.com

2 Information from Facebook

In this demonstration, to simplify the discussion, we focus on information about photos, especially the ones (explicitly) referring to the user. However, the basic notions and approach described apply similarly to other types of information.

The basic concepts extracted using the Facebook Graph API² concern people (users), photos, comments, places and dates. Individuals (variables and constants) therefore represent instances of these concepts. Predicates represent relationships. For example, users can be friends with each other (friend(bob, alice)), a photo can be at a place (photoAt(photo1, segovia)), at a certain date (date(photo1, 08-07-2013)) and include some users (onPhoto(photo1, bob)). Finally, any post including photos has a privacy scope, which can be everyone, friendoffriend, allfriends or custom (e.g., scope(photo1, allfriends)).

3 Basic ontological modeling and reasoning

From the explicit data extracted from Facebook, basic information can be inferred using ontology-based mechanisms. For example, including range and domain information associated with the predicates mentioned above can help identify the types of objects (e.g., friend(bob, alice) implies that person(bob) and person(alice)). Similarly, using constructs available in OWL 2, the friend predicate can be declared to be reciprocal (as it is in Facebook): friend(bob, alice) implies friend(alice, bob).
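In the system itself these inferences are carried out by an OWL reasoner. As an illustration only, the typing (domain/range) and reciprocity inferences can be sketched as a naive forward-chaining closure over the example facts; the predicate names come from the text above, while the `closure` helper is ours and not part of the actual system:

```python
def closure(facts):
    """Repeatedly apply domain/range and symmetry axioms until a fixpoint.

    Facts are tuples: ("friend", x, y), ("onPhoto", pic, per),
    ("person", x), ("photo", pic).
    """
    derived = set(facts)
    while True:
        new = set()
        for f in derived:
            if f[0] == "friend":
                _, s, o = f
                new.add(("friend", o, s))   # friend is reciprocal (symmetric)
                new.add(("person", s))      # domain of friend
                new.add(("person", o))      # range of friend
            elif f[0] == "onPhoto":
                _, pic, per = f
                new.add(("photo", pic))     # domain of onPhoto
                new.add(("person", per))    # range of onPhoto
        if new <= derived:                  # fixpoint reached
            return derived
        derived |= new

facts = {("friend", "bob", "alice"), ("onPhoto", "photo1", "bob")}
inferred = closure(facts)
# among others, friend(alice, bob) and person(alice) are now derived
```

In OWL 2 terms, the two rules in the loop correspond to a SymmetricObjectProperty declaration and to domain/range axioms on the two properties.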
Property hierarchies can also be used to introduce intermediary predicates, more abstract than the notions explicitly available in Facebook; for example, declaring friend as a sub-property of know (so that friend(bob, alice) implies know(bob, alice)). The same mechanisms, combined with the property composition construct available in OWL 2, can be used to represent much more complex inferences (e.g., that if two people are on the same photo, they know each other). However, for convenience, we choose to use rules (which can also be used for other types of inferences not feasible with basic OWL constructs) for such complex implications.

4 Rule-based inference

As mentioned above, some more complex inferences need to be represented that are not conveniently achieved with ontological constructs. This includes inferences such as: being on a picture geotagged with a certain place implies that the user was at that place (wasIn(Per, Pl) :- onPhoto(Pic, Per), photoAt(Pic, Pl)), or that two users on the same photo know each other (know(Per1, Per2) :- onPhoto(Pic, Per1), onPhoto(Pic, Per2)), and possibly that they were at the same place.

² https://developers.facebook.com/docs/reference/api/

5 Epistemic inference

The mechanisms described above make it possible for the model to explicitly make the inferences possible from the information being shared. However, the important aspect here is not only which inferences can be made, but who can make them. To address this, we use notions from epistemic logics [5]. Indeed, epistemic logics are a type of logic that allows one to express not only statements about the world, but also statements about the way the world is perceived or known by agents in the world. In such a logic, a statement of the form K_a φ indicates that the agent a 'knows' the statement φ to be true. Basic properties, such as that of self-reflection (i.e., K_a φ → K_a K_a φ), and rules can be used to reason upon the knowledge agents have of some information.

This framework, combined with information about the privacy settings of Facebook, allows us to express information regarding which user might have access to what item of information.
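As an illustration of this simplification (and not of the system's actual Prolog encoding), the modality K_a can be reified as a pair (agent, fact), and agent knowledge then derived from a photo's privacy scope; the sketch below covers only the allfriends case, using the author, scope and friend predicates introduced earlier:

```python
def who_knows(facts):
    """Derive reified K_a photo(Pic) pairs: for a photo with allfriends
    scope, the author and each of the author's friends know the photo."""
    knows = set()
    for f in facts:
        if f[0] != "author":
            continue
        _, pic, auth = f
        if ("scope", pic, "allfriends") in facts:
            knows.add((auth, ("photo", pic)))       # authors know their own photos
            for g in facts:
                if g[0] == "friend" and g[1] == auth:
                    knows.add((g[2], ("photo", pic)))  # K_friend photo(pic)
    return knows

facts = {
    ("author", "photo1", "bob"),
    ("scope", "photo1", "allfriends"),
    ("friend", "bob", "alice"),
    ("friend", "bob", "carol"),
}
knows = who_knows(facts)
# both alice and carol 'know' photo1
```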
Straightforwardly, for example, information on who knows about a photo can be derived from the privacy scope of the photo (e.g., K_a photo(Pic) :- author(Pic, Per), scope(Pic, allfriends), friend(Per, a)). More complex mechanisms are also represented using this type of rule, however: for example, that the friends of somebody tagged in a photo would know about the photo, or that knowing about a photo implies knowing all the information attached to the photo, and the possible inferences that can be made from it (e.g., that somebody was in a certain place with somebody else).

6 Implementation

Fig. 1. Screenshots of the system making explicit privacy inferences in Facebook.

The system showing to a user the inferences that can be made from shared items concerning them (currently focusing on photos), and by whom, is implemented as a Web-based interface built in PHP and Javascript, which allows the user to connect to their Facebook account and extract the relevant information. The knowledge representation and reasoning mechanisms described above are delegated to a Prolog-based API, carrying out the ontological reasoning (through a basic mapping between OWL and Prolog), the rule-based reasoning and a simplified implementation of the epistemic rules described in the previous sections.

As shown in the screenshots of Figure 1, the system displays the inferred information to the user: 1) the people they are friends with, the ones they know (without being friends) and the people the user might not know, but who might have access to some of their information; 2) the photos depicting the user; 3) the places where the user has been (with whom and on what date). Clicking on a person (as shown in Figure 1) displays information about items shared by this user, as well as the information they know about the logged-in user. Clicking on an item displays the information that can be inferred from this item, and the people who might have access to these inferences.
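Putting the rule-based and epistemic layers together, the kind of chain the interface surfaces (from a tagged, geotagged photo to the people who can infer the user's whereabouts) can be sketched as follows; this is a simplification using the predicate names from this paper, not the actual OWL/Prolog implementation:

```python
def visible_inferences(facts):
    """For each agent who can see a photo (allfriends scope), derive the
    wasIn inferences they can make from it, following the rule
    wasIn(Per, Pl) :- onPhoto(Pic, Per), photoAt(Pic, Pl)."""
    result = set()
    authors = {(pic, per) for (p, pic, per) in facts if p == "author"}
    for pic, auth in authors:
        if ("scope", pic, "allfriends") not in facts:
            continue
        # under the all-friends scope, the author and their friends see the photo
        viewers = {b for (p, a, b) in facts if p == "friend" and a == auth}
        viewers.add(auth)
        tagged = {per for (p, q, per) in facts if p == "onPhoto" and q == pic}
        places = {pl for (p, q, pl) in facts if p == "photoAt" and q == pic}
        for per in tagged:
            for pl in places:
                for v in viewers:
                    result.add((v, ("wasIn", per, pl)))
    return result

facts = {
    ("onPhoto", "photo1", "bob"),
    ("photoAt", "photo1", "segovia"),
    ("author", "photo1", "bob"),
    ("scope", "photo1", "allfriends"),
    ("friend", "bob", "alice"),
}
access = visible_inferences(facts)
# alice, as bob's friend, can infer wasIn(bob, segovia)
```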
7 Conclusion

Our goal in this demonstration is to show that knowledge modeling and reasoning techniques can support the notion of privacy mirrors, in systems where the privacy implications of information sharing are complex and difficult for a user to keep track of. In the demonstration, participants will be able to connect the system to their own Facebook account, to check whether the results are surprising, concerning or, on the contrary, simply reassuring (which are the types of reactions we uncovered in another study [3]). In terms of future work, besides completing and validating the modeling of Facebook's privacy mechanisms (which can be a complex task), one of the interesting research directions is to integrate the model of Facebook with other sources of personal information sharing (using, for example, techniques described in some of our previous works, e.g., [1, 2]). The other direction we plan to investigate is the use of more sophisticated knowledge representation techniques to deal with the complexity of online social situations, including uncertainty and different levels of epistemic knowledge of information (e.g., having access to information vs. having surely seen a piece of information).

References

1. M. d'Aquin, S. Elahi, and E. Motta. Personal monitoring of web information exchange: Towards web lifelogging. Web Science, 2010.
2. M. d'Aquin, S. Elahi, and E. Motta. Semantic technologies to support the user-centric analysis of activity data. In Social Data on the Web (SDoW) workshop at ISWC, 2011.
3. M. d'Aquin and K. Thomas. Consumer activity data: Usages and challenges. Knowledge Media Institute, Tech. Report kmi-12-03, 2012.
4. T. Erickson and W. A. Kellogg. Social translucence: an approach to designing systems that support social processes. ACM Transactions on Computer-Human Interaction (TOCHI), 7(1):59-83, 2000.
5. J.-J. Ch. Meyer and W. van der Hoek. Epistemic Logic for AI and Computer Science. Cambridge University
Press, 2004.
6. D. H. Nguyen and E. D. Mynatt. Privacy Mirrors: Understanding and shaping socio-technical ubiquitous computing systems. GVU Technical Report GIT-GVU-02-16, 2002.