"Facebook regularly meets with law enforcement and responds to the legal process."

Facebook has a history of working with law enforcement investigating potential criminal activity related to the site's live-streaming video feature, the social media network told News One exclusively Tuesday afternoon. While Facebook would not confirm whether it had been in contact with police investigating the viral video surrounding the death of a teenager in Chicago last weekend, a spokesperson said the company has policies in place to deter, prevent and address certain user behavior, such as bullying, threats of suicide and the celebration of crimes.

Citing security concerns, Facebook declined to officially comment on this specific case “because all the facts haven’t been sorted out,” company spokesperson Ruchika Budhraja told News One in a statement. However, she added, “Facebook regularly meets with law enforcement and responds to the legal process in situations like this.”

Kenneka Jenkins went missing early Saturday morning after a night of partying with friends at a suburban Chicago hotel. The 19-year-old was found dead hours later in a walk-in freezer at the same hotel. That much has been confirmed. But shortly after her death was reported, a Facebook Live video went viral that featured Jenkins' friend Irene Roberts speaking and may have even featured Jenkins herself.

Nearly everything else remains a mystery, including how Jenkins died. Twitter users have alleged that Jenkins was raped after being set up by her friends, but that theory has not been confirmed.

Police initially said Jenkins was drunk and "staggering" before she let herself into the freezer, but her mother disputed that account. "Those were double steel doors, she didn't just pop them open," Tereasa Martin said at the time. Martin also said Jenkins' friends had changed their stories multiple times.

Facebook has a set of community standards to safeguard against prohibited content on the network, Budhraja said. Content that violates those standards is taken down within 24 hours at most. In addition to the roughly 4,500 people already reviewing reported content, Facebook expects to have about 3,000 more working in that capacity by the end of the year.