Facebook is expanding its suicide prevention tools with artificial intelligence so the network can identify potentially suicidal people who are livestreaming. Viewers can still report a streamer they are concerned about, but the AI may also step in without any reports.

If Facebook's AI flags a stream, the streamer may be presented with contact information for the Crisis Text Line, the National Eating Disorders Association, and the National Suicide Prevention Lifeline.

Tools of this sort first appeared on Facebook last year, when the network began letting users flag content for review.