On Friday, Facebook said it is working with suicide prevention groups to collect phrases, hashtags and group names linked to online challenges, such as the Blue Whale game, that encourage self-harm or suicide.

“We offer resources to people that search for these terms on Facebook,” the social media giant said.

The Blue Whale game, or Blue Whale Challenge, is a suicide game in which a group of administrators or a designated curator gives a participant a task to complete each day over a period of 50 days, the final task being for the participant to take his or her own life. Participants are expected to share photos of the tasks they have completed.

These daily tasks start off easy, such as listening to certain genres of music, waking up at odd hours or watching a horror movie, and then slowly escalate to carving shapes into one’s skin, self-mutilation and, eventually, suicide.

Facebook said it will also remove content that violates its Community Standards, which do not allow the promotion of self-injury or suicide.

Starting on World Suicide Prevention Day on September 10, Facebook said it would also connect people with information about support groups and suicide prevention tools in News Feed.

“Facebook is a place where people connect and share, and one of the things we have learnt from the mental health partners and academics we have worked with on this issue, is that being connected is a protective factor in suicide prevention,” said Ankhi Das, Director of Public Policy for Facebook in India, South and Central Asia.

Additional resources about suicide prevention and online well-being will also be added to its Safety Center, Facebook said.

With these resources, people can access tools to resolve conflict online, help a friend who is expressing suicidal thoughts or get resources if they are going through a difficult time.

“We care deeply about the safety of the millions of people who use Facebook to connect with the people who matter to them, and recognize there’s an opportunity with these tools and resources to connect someone who is struggling with a person they already have a relationship with,” Das said.

Facebook’s Safety Center also offers guidance for parents, teenagers, educators, and law enforcement officials to start a conversation about online safety, with localized resources and videos available.

People can also reach out to Facebook when they see something that makes them concerned about a friend’s well-being.

“We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports of suicide. For those who reach out to us, we provide suggested text to make it easier for people to start a conversation with their friend in need,” Facebook said.

“We provide the friend who has expressed suicidal thoughts information about local help lines, along with other tips and resources,” it added.