, which represents the sequence of social links that constitute the selected path from p_i to p_j in the SIoT. At this point, the Trustworthiness Management component is expected to provide the key function of listing the trust level of any node in Z_h. This is the objective of our work.

IV. SUBJECTIVE TRUST MANAGEMENT MODEL

A. Basic elements

In the above scenario, we envision a subjective trustworthiness model, where each node p_i computes the trustworthiness of its N_i friends on the basis of its own experience and of the opinions of the K_ij friends it has in common with each of them. We refer to this trustworthiness as T_ij, i.e., the trustworthiness of node p_j as seen by node p_i. If p_i and p_j are not friends, the trustworthiness is calculated by word of mouth through a chain of friendships. A node's trustworthiness is determined by evaluating its behaviour as perceived by the nodes in the network that have interacted with it. This reputation reflects the degree of trust that the other nodes in the social network place in the given node on the basis of their past direct (direct interactions) or indirect (through intermediate nodes) experiences. To this end, we identify the following major factors, derived from similar ones used in trustworthiness algorithms for P2P networks:

• A feedback system allows a node p_i to provide an evaluation of the service it has received from the provider p_j. Feedback is represented by f^l_ij, which refers to transaction l and can be expressed either in a binary way (f^l_ij ∈ {0,1}, i.e., p_i rates 1 if it is satisfied by the service and 0 otherwise) or with values in a continuous range (f^l_ij ∈ [0,1]) to capture different levels of satisfaction.

• The total number of transactions between two nodes, indicated by N_ij, enables the model to detect whether two nodes p_i and p_j have an abnormally high number of transactions.

• The credibility of node p_j as seen by p_i, indicated by C_ij, represents a key factor in evaluating the information (feedback and trust levels) provided by the nodes. It can assume values in the range [0,1], where 1 represents
full credibility for the node.

• The transaction factor ω^l_ij indicates the relevance of transaction l between nodes p_i and p_j. It is used to discriminate important transactions (ω^l_ij = 1) from irrelevant ones (ω^l_ij = 0), and can be used as a weight for the feedback. This parameter prevents nodes from building up their trustworthiness through small transactions and then behaving maliciously in an important one. In addition, it can be used to differentiate transactions so as to consider a node trusted only for a certain type of service.

To the above, we add two other key factors that exploit the main features of the social network among objects:

• The relationship factor F_ij indicates the type of relation that connects p_i with p_j and represents a unique characteristic of the SIoT. It is useful to either mitigate or enhance the information provided by a single friend. Table I shows the values of the relationship factor for every relation type, where higher values indicate higher trustworthiness. This is a possible setting, which we use in this paper on the basis of the following reasoning; alternative values can be used if justified by different principles. Between two objects that belong to the same owner, and are therefore linked by an OOR, it is very unlikely to find a malicious node, so the highest factor value is assigned to this kind of relationship. Similar reasoning applies to the CLOR and CWOR cases, since these relations are established between domestic objects or objects of the same workplace, respectively. SORs are relationships established between objects that are encountered only occasionally, and for this reason they are associated with a smaller factor. Finally, PORs are the riskiest, since they are created between objects of the same brand that have never met and depend only on the object model. If two nodes are tied by two or more relationships, the strongest relation, i.e., the one with the highest factor, is considered.

• The centrality of node p_j with respect to p_i, indicated by R_ij. It provides a peculiar piece of information about the
social network: if a node has many relationships or is involved in many transactions, it is expected to assume a central role in the network. As described in [14], centrality is "related to group efficiency in problem-solving, perception of leadership and the personal satisfaction of participants".

A further important characteristic of the IoT members is also considered:

• The computation capabilities of an object, namely its intelligence I_j. It is a static characteristic of the object, since it does not vary over time but depends only on the type of object considered. Indeed, we expect that a smart object has more capabilities to cheat than a "dummy" object, which makes its transactions riskier. Accordingly, we identify four different classes of objects, where each class is defined on the basis of the computation capabilities, and we assign a different value to each class, as shown in Table II: Class 1 is assigned to mobile objects with great computational and communication capabilities, such as smartphones, tablets, and vehicle control units; Class 2 is assigned to static objects with significant computing capabilities, such as displays, set-top boxes, and smart video cameras; Class 3 is assigned to objects with only sensing capabilities, that is, any object capable of providing a measure of the environment status; finally, Class 4 is assigned to RFID-tagged objects.

TABLE I
RELATIONSHIP FACTOR
  Ownership Object Relationship    (OOR)   0.9
  Co-Location Object Relationship  (CLOR)  0.8
  Co-Work Object Relationship      (CWOR)  0.8
  Social Object Relationship       (SOR)   0.6
  Parental Object Relationship     (POR)   0.5

TABLE II
COMPUTATION CAPABILITIES
  Class 1   Smartphone, tablet                0.8
  Class 2   Set-top box, smart video camera   0.6
  Class 3   Sensor                            0.4
  Class 4   RFID                              0.2

B. Subjective Trustworthiness

In this approach, each node stores and manages the feedback needed to locally calculate the level of trustworthiness. This is intended to avoid single points of failure and tampering with the trustworthiness values. We first describe the scenario in which nodes
p_i and p_j are adjacent, i.e., share a social relationship, and we define T_ij, namely the trustworthiness of node p_j as seen by p_i, as follows:

    T_{ij} = \alpha R_{ij} + \beta I_j + \gamma O^{dir}_{ij} + \delta O^{ind}_{ij}    (1)

where α, β, γ, and δ give different weights to the terms of the sum and are such that α + β + γ + δ = 1, in order to keep the trustworthiness value between 0 and 1. Eq. (1) shows that node p_i computes the trustworthiness of its friends on the basis of their centrality R_ij, of their intelligence I_j, of its own direct experience O^dir_ij, and of the opinion O^ind_ij of the K_ij friends it has in common with node p_j. In this context, the centrality of p_j is defined as

    R_{ij} = \frac{|K_{ij}|}{|N_i|}    (2)

and represents how central node p_j is in the "life" of p_i. This aspect helps prevent malicious nodes that build up a lot of relationships from obtaining high centrality values. If two nodes have many friends in common, they are likely to apply similar criteria when establishing relationships, all the more so if we consider the possibility of terminating a relationship with a node whose trustworthiness is very low.

When a node p_i needs information about the trustworthiness of a node p_j, it checks the last direct transactions and determines its own opinion as follows:

    O^{dir}_{ij} =
      \begin{cases}
        F_{ij} & \text{if } N_{ij} = 0 \\
        \dfrac{\log(N_{ij}+1)}{1+\log(N_{ij}+1)} \left( \varepsilon\, O^{lon}_{ij} + \chi\, O^{rec}_{ij} \right) + \dfrac{1}{1+\log(N_{ij}+1)}\, F_{ij} & \text{if } N_{ij} > 0
      \end{cases}    (3)

In eq. (3), two opinions are calculated using different sizes for the temporal windows of observation: O^lon for the long-term opinion and O^rec for the short-term opinion. Two different weights are defined for the long- and short-term opinions, namely ε and χ, such that ε + χ = 1.

It is important to note that, even if no transactions are available for node p_i to judge node p_j (N_ij = 0), a first evaluation is still obtained by considering the type of relation that links the two nodes. When more information becomes available from the transactions between p_i and p_j (N_ij > 0), the relationship factor starts to lose its importance, and eventually only the opinion built up from past transactions is considered.
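For illustration, the piecewise form of eq. (3) can be sketched in a few lines of Python. This is our own sketch, not code from the paper; the function name is ours, and the default weights ε = χ = 0.5 are taken from the simulation settings of Section V:

```python
import math

def direct_opinion(F_ij, N_ij, O_lon, O_rec, eps=0.5, chi=0.5):
    """Direct opinion O^dir_ij of eq. (3): the relationship factor F_ij
    dominates when there is little history and fades as N_ij grows."""
    if N_ij == 0:
        return F_ij  # no history: rely only on the relationship type
    w = math.log(N_ij + 1) / (1 + math.log(N_ij + 1))
    # w weights the experience-based term; (1 - w) = 1/(1+log(N_ij+1))
    return w * (eps * O_lon + chi * O_rec) + (1 - w) * F_ij

print(direct_opinion(F_ij=0.9, N_ij=0, O_lon=0.0, O_rec=0.0))  # -> 0.9
```

Note how, as intended by the model, a large N_ij pushes the opinion toward the transaction history and away from F_ij.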

The long- and short-term opinions needed in eq. (3) are defined as

    O^{lon}_{ij} = \frac{\sum_{l=1}^{L^{lon}} f^{l}_{ij}\, \omega^{l}_{ij}}{\sum_{l=1}^{L^{lon}} \omega^{l}_{ij}}    (4)

    O^{rec}_{ij} = \frac{\sum_{l=1}^{L^{rec}} f^{l}_{ij}\, \omega^{l}_{ij}}{\sum_{l=1}^{L^{rec}} \omega^{l}_{ij}}    (5)

where L^lon is the temporal window for the long-term opinion, L^rec is the analogous window for the short-term opinion, with L^lon > L^rec, and l indexes the transactions from the latest to the oldest. Moreover, the transaction factor ω^l_ij weights the feedback so as to distinguish important transactions from unimportant ones. The short-term opinion is useful when evaluating the risk associated with a node, i.e., the possibility that a node starts acting maliciously, or oscillates around a regime value, after having built up its reputation, which would allow it to suddenly spoil the service of requesting nodes. The long-term opinion is not sensitive enough to detect this scenario, since it needs a long time to change the accumulated score.
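Eqs. (4) and (5) are the same ω-weighted average of the feedback, applied to windows of different lengths over the transaction history. A minimal sketch, with hypothetical variable names (the zero-weight guard is our own assumption, for the case where every transaction in the window is irrelevant):

```python
def windowed_opinion(feedback, weights, window):
    """Weighted average of the last `window` feedback values (eqs. (4)-(5)).
    `feedback` and `weights` are ordered from the latest transaction backwards."""
    f = feedback[:window]
    w = weights[:window]
    total = sum(w)
    if total == 0:  # only irrelevant transactions in the window (assumption)
        return 0.0
    return sum(fi * wi for fi, wi in zip(f, w)) / total

history_f = [1, 1, 0, 1, 1, 1, 0, 0]   # binary feedback, latest first
history_w = [1.0] * 8                  # all transactions equally relevant
O_rec = windowed_opinion(history_f, history_w, window=5)  # 4/5 = 0.8
O_lon = windowed_opinion(history_f, history_w, window=8)  # 5/8 = 0.625
```

The example shows why the short window reacts faster: the two oldest failures still drag O_lon down, while O_rec already reflects the recent good behaviour.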

The indirect opinion can be expressed as

    O^{ind}_{ij} = \frac{\sum_{k=1}^{|K_{ij}|} C_{ik}\, O^{dir}_{kj}}{\sum_{k=1}^{|K_{ij}|} C_{ik}}    (6)

where each of the K_ij friends in common gives its own direct opinion of node p_j. These opinions are then weighted by p_i on the basis of the credibility C_ik of the node that provides them. The credibility is calculated as

    C_{ik} = \eta\, O^{dir}_{ik} + \mu\, R_{ik} + \rho\, (1 - I_k)    (7)

where η + μ + ρ = 1. From (7) we see that C_ik depends on the direct experience between the two nodes, on their centrality, and on the intelligence of the node providing the opinion. Its computation requires adjacent nodes to exchange information about their direct opinions and lists of friends, which may be an issue. To reduce the traffic load, node p_i can request the indirect opinion only from those nodes with a high credibility value.

Eqs. (2)-(7) allow us to finally compute the subjective trustworthiness in (1). By the very idea of subjective trustworthiness, none of the formulas shown in this section is symmetric, and in general T_ij ≠ T_ji.
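Eqs. (6) and (7) together say: ask each common friend p_k for its direct opinion of p_j, and average those opinions weighted by how credible p_k is in p_i's eyes. A sketch under our own naming conventions, with the η, μ, ρ defaults taken from the simulation settings of Section V (the per-friend tuples are invented for the example):

```python
def credibility(O_dir_ik, R_ik, I_k, eta=0.7, mu=0.15, rho=0.15):
    """Credibility C_ik of eq. (7); smarter objects (high I_k) are
    considered riskier advisers, hence the (1 - I_k) term."""
    return eta * O_dir_ik + mu * R_ik + rho * (1 - I_k)

def indirect_opinion(friends):
    """Credibility-weighted average of the common friends' direct
    opinions of p_j (eq. (6)). `friends` maps a friend id to the
    tuple (O_dir_ik, R_ik, I_k, O_dir_kj)."""
    num = den = 0.0
    for O_dir_ik, R_ik, I_k, O_dir_kj in friends.values():
        C_ik = credibility(O_dir_ik, R_ik, I_k)
        num += C_ik * O_dir_kj
        den += C_ik
    return num / den if den else 0.0

common_friends = {
    "k1": (0.9, 0.4, 0.2, 0.8),  # well-known, dumb friend praising p_j
    "k2": (0.2, 0.1, 0.8, 0.1),  # barely known, smart friend criticising p_j
}
O_ind = indirect_opinion(common_friends)  # dominated by k1's opinion
```

Because C_ik depends only on p_i's own experience with p_k, the poorly known friend k2 contributes little, which is exactly the property invoked later to explain robustness against collusion.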

If the node that requests the service, p_i, and the node that provides it, p_j, are not close, i.e., are not in a direct relationship, then the trustworthiness can be computed by multiplying the trustworthiness values between adjacent nodes along the considered route from the requester to the provider, that is

    T_{ij} = \prod_{d=i}^{j-1} T_{d,d+1}    (8)

At the end of each transaction, p_i assigns a feedback f^l_ij to the service received. In the case of adjacent nodes p_i and p_j, p_i directly assigns a feedback f^l_ij to p_j, and also assigns a feedback to each of the friends in K_ij that contributed to the calculation of the trustworthiness by providing O^dir_kj, according to the following:

    f^{l}_{ik} =
      \begin{cases}
        f^{l}_{ij} & \text{if } O^{dir}_{kj} \ge 0.5 \\
        1 - f^{l}_{ij} & \text{if } O^{dir}_{kj} < 0.5
      \end{cases}    (9)

The reference node p_k thus receives a feedback according to the opinion value it suggested to p_i, which rewards or penalizes it for its advice. In the case of more than one degree of separation, the intermediary nodes can propagate the feedback up to the provider, but only if the previous node, i.e., the node that propagates the feedback, has a credibility greater than a threshold.

V. EXPERIMENTAL EVALUATION

A. Simulation Setup

To conduct our performance analysis, we would need mobility traces of a large number of objects. Since such data is not available to date, we resorted to the mobility model called Small World In Motion (SWIM) [15] to generate the needed traces. The rationale behind the use of SWIM lies in its ability to accurately match the social behavior of human beings, as has been shown with the most popular mobility traces available in CRAWDAD [16]. However, the output of the SWIM model is a trace of the positions of humans. We therefore assume that each user owns a set of things connected to the SIoT and that, during any movement, the user carries half of these objects and leaves the others at home. Objects that stay at home create co-location relationships. Every node belongs to a specific model, so that objects of the same model share a parental object relationship. The other relationships are created on the basis of the owners' movements.

TABLE III
SETTING OF WEIGHTS DURING SIMULATIONS
  α   weight of the centrality                         0.15
  β   weight of the object characteristic              0.15
  γ   weight of the direct opinion                     0.4
  δ   weight of the indirect opinion                   0.3
  ε   weight of the long-term opinion                  0.5
  χ   weight of the short-term opinion                 0.5
  η   weight of the direct opinion in the credibility  0.7
  μ   weight of the centrality in the credibility      0.15
  ρ   weight of the intelligence in the credibility    0.15

We run the experiment with 800 nodes (by default), considering that each person possesses an average of 7 objects. Two different behaviors can be considered in a social network. One is always benevolent and cooperative, and we call the corresponding nodes social nodes. The other is a strategic behavior corresponding to an opportunistic participant that cheats whenever it is advantageous to do so; we call the corresponding nodes malicious nodes. The percentage of malicious nodes is denoted by mp and is set by default to mp = 25%. Malicious node behaviors can be collusive or non-collusive. A node with a non-collusive behavior provides bad services and false feedback; it can occasionally choose to cooperate in order to confuse the network. We denote by mr the percentage of time in which these nodes behave maliciously (by default mr = 100%). In a collusive environment, malicious nodes create groups that cooperate to grow each other's trustworthiness; we suppose, for simplicity, that a group of malicious nodes is identified by nodes tied by an OOR, so that for mp = 25% the number of collusive groups is set to 32. At the start of each transaction, the simulator randomly chooses the node requesting the service and a certain percentage of nodes that can provide the service. The response percentage is denoted by res and is set to 5%.
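As a toy illustration of how a simulated transaction could combine the multi-hop trust of eq. (8) with the recommender feedback of eq. (9), consider the following sketch (the route layout and all trust values are invented for the example):

```python
def chain_trust(hop_trusts):
    """Trustworthiness along a multi-hop route (eq. (8)): the product
    of the pairwise trust values of adjacent nodes on the path."""
    t = 1.0
    for hop in hop_trusts:
        t *= hop
    return t

def recommender_feedback(f_lij, O_dir_kj):
    """Feedback returned to a recommender p_k (eq. (9)): p_k is rewarded
    when its opinion agreed with the transaction outcome, penalised otherwise."""
    return f_lij if O_dir_kj >= 0.5 else 1 - f_lij

# Route p_i -> a -> b -> p_j with pairwise trust values 0.9, 0.8, 0.9:
T_route = chain_trust([0.9, 0.8, 0.9])  # trust decays with each hop

# The transaction succeeded (feedback 1): a friend that praised p_j is
# rewarded, one that advised against p_j is penalised.
good_adviser = recommender_feedback(1, O_dir_kj=0.8)  # -> 1
bad_adviser = recommender_feedback(1, O_dir_kj=0.3)   # -> 0
```

The multiplicative chain makes long routes intrinsically less trusted, which is consistent with the role of the number of hops in the overhead discussion below.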
The malicious node can then be the node requesting the service, the node providing the service, or the node providing its opinion about another node. Table III shows the values of the weights used during the simulations. Additionally, the numbers of transactions considered in the long-term (L^lon) and short-term (L^rec) opinions have been set to 50 and 5, respectively. Finally, each object randomly belongs to one of the computation capability classes.

After a node chooses the provider of the service on the basis of the highest computed trustworthiness level, it sends the service request to it. Depending on how the SIoT model is implemented, the service can be delivered either through the nodes that discovered the service, i.e., the social network is also used to transmit the service requests and responses on top of the existing transport network (overlay structure), or hop-by-hop through the existing communication network, i.e., the requester and the provider communicate directly (non-overlay structure). In the latter case, a malicious node can alter the service only if it is the provider. In the former case, a malicious node can interfere with the delivery of the service whenever it lies on the route from p_i to p_j.

[Fig. 1. Transaction success rate vs. transaction number; curves: No Trust, Collusive/Overlay, Collusive/No-overlay, Noncollusive/Overlay, Noncollusive/No-overlay]

B. Simulation Results

We define the transaction success rate as the ratio between the number of successful transactions and the total number of transactions. Fig. 1 shows the success rate in the non-collusive and collusive scenarios, with and without an overlay network; the case in which no trust model is used is also presented. We can observe that, in the collusive scenario, the behavior of the subjective approach is almost equal to the non-collusive case, i.e., this approach is immune to this kind of attack. This follows from the very idea of a subjective approach. Indeed, when a node requires the trustworthiness value of a member of a collusive group, the only information it needs from other nodes, which may then be malicious, is that related to the indirect opinion (eq. (6)), since all other information is stored locally in the node itself. This information is weighted by the credibility of the node that provides it (eq. (7)), which depends on the node's own experience only.

We now want to show the robustness of our approach with respect to the concentration of malicious nodes. In all the experiments, we collect the output after 11000 random transactions have been completed in the network, so that the system is in a steady state. We then perform 100 additional transactions to study the system behavior in response to different values of the concentration of malicious nodes mp, in both the non-collusive and collusive cases
(Fig. 2) and for overlay and non-overlay structures. We can observe that there are only slight differences between the different configurations. Thus, our approach is robust to collusive behavior and is able to isolate malicious nodes in the route. However, in our approach the error percentage never exceeds 15%.

[Fig. 2. Unsuccess rate vs. fraction of malicious nodes]

So far, we have demonstrated how the proposed approach deals with malicious behavior. We are now interested in evaluating the runtime overhead and how it scales with the number of nodes. In our model, every node stores the trust information locally. When a node needs to know the trustworthiness of another node, it uses the information about its own experience and asks its friends for their opinions. These operations are replicated at each hop during the discovery of the nodes that can provide the service. The request for the friends' opinions can be accomplished either by asking all of them (flooding) or by asking only those friends with a trustworthiness above a certain threshold. The runtime overhead is therefore strictly correlated with the number of hops between requester and provider. The results on the runtime overhead for different numbers of nodes and 100 transactions are shown in Fig. 3.

[Fig. 3. Trust computation overhead; curves: threshold=0, threshold=0.4]

In analyzing this behavior, we have to consider that service discovery and trustworthiness computation can be carried out at the same time. Moreover, we have assumed that the service providers are uniformly distributed over the network, whereas it has been proved that friends share similar interests, the so-called homophily [17], so that it is highly probable to find a service in the friends list. These observations could reduce the runtime overhead of our approach but, at this time, we do not have enough information to take them into account.

VI. CONCLUSIONS

In this paper we have focused on the management and evaluation of trustworthiness in the SIoT context, so as to allow objects to interact in a way that is safe and resistant to malicious attacks. To this end, we have first analyzed the factors that influence the evaluation of trustworthiness, and we have then proposed a subjective approach, where each node has its own view of the network. To demonstrate the effectiveness of
our algorithm against malicious behaviors, we have run a large simulation campaign and have shown its strong and weak aspects from several points of view.

REFERENCES

[1] L. Atzori, A. Iera, and G. Morabito, "The internet of things: A survey," Computer Networks, vol. 54, no. 15, pp. 2787-2805, 2010.
[2] P. Mendes, "Social-driven internet of connected objects," in Proc. of the Interc. Smart Objects with the Internet Workshop, 25th March 2011.
[3] L. Ding, P. Shi, and B. Liu, "The clustering of internet, internet of things and social network," in Proc. of the 3rd Inter. Symp. on Knowl. Acquis. and Modeling, 2010.
[4] E. Kosmatos, N. D. Tselikas, and A. C. Boucouvalas, "Integrating RFIDs and smart objects into a unified internet of things architecture," Advances in Internet of Things, vol. 1, no. 1, pp. 5-12, 2011.
[5] L. Atzori, A. Iera, and G. Morabito, "SIoT: Giving a social structure to the internet of things," IEEE Communications Letters, vol. 15, no. 11, pp. 1193-1195, November 2011.
[6] P. Resnick, K. Kuwabara, R. Zeckhauser, and E. Friedman, "Reputation systems," Commun. ACM, vol. 43, pp. 45-48, December 2000.
[7] S. D. Kamvar, M. T. Schlosser, and H. Garcia-Molina, "The eigentrust algorithm for reputation management in p2p networks," in Proc. WWW '03. New York, NY, USA: ACM, 2003, pp. 640-651.
[8] A. A. Selcuk, E. Uzun, and M. R. Pariente, "A reputation-based trust management system for p2p networks," in Proc. of CCGRID 2004. Washington, DC, USA: IEEE Computer Society, 2004, pp. 251-258.
[9] R. Jurca and B. Faltings, "An incentive compatible reputation mechanism," in Proc. AAMAS '03. New York, NY, USA: ACM, 2003, pp. 1026-1027.
[10] Z. Liang and W. Shi, "Enforcing cooperative resource sharing in untrusted p2p computing environments," Mob. Netw. Appl., vol. 10, December 2005.
[11] Y. Wang and J. Vassileva, "Bayesian network-based trust model," in Proc. of the 2003 IEEE/WIC International Conference on Web Intelligence (WI '03). Washington, DC, USA: IEEE Computer Society, 2003.
[12] B. Yu, M. P. Singh, and K. Sycara, "Developing trust in large-scale peer-to-peer systems," in Proc. of the First IEEE Symposium on Multi-Agent Security and Survivability, 2004, pp. 1-10.
[13] L. Xiong and L. Liu, "PeerTrust: Supporting reputation-based trust for peer-to-peer electronic communities," IEEE Transactions on Knowledge and Data Engineering, vol. 16, pp. 843-857, 2004.
[14] L. Freeman, "Centrality in social networks: conceptual clarification," Social Networks, vol. 1, no. 3, pp. 215-239, 1979.
[15] S. Kosta, A. Mei, and J. Stefa, "Small world in motion (SWIM): Modeling communities in ad-hoc mobile networking," in SECON 2010, June 2010, pp. 1-9.
[16] J. Leguay, A. Lindgren, J. Scott, T. Friedman, J. Crowcroft, and P. Hui, "CRAWDAD data set upmc/content (v. 2006-11-17)," Nov. 2006.
[17] H. Bisgin, N. Agarwal, and X. Xu, "Investigating homophily in online social networks," in Proc. WI-IAT 2010, vol. 1, 2010, pp. 533-536.