AI startup enlisted for project to predict suicide risk in Canada

The Public Health Agency of Canada is turning to an Ottawa artificial-intelligence startup that studies social media to predict potential spikes in suicide rates – information that could give the federal government advance warning, allowing it to proactively deploy mental-health resources to at-risk communities where lives might be in danger.

Advanced Symbolics Inc. uses several types of AI processes to scan public social-media posts and discussions to assess topics, patterns and tone. The company and its clients can then use that information to predict – and, in the case of self-harm or suicide, try to protect against – future incidents. The project will stop short of individual surveillance, both the company and the government said: Only public social-media posts will be included, and neither party will have access to any user's identifying information.

The government has enlisted Advanced Symbolics for a pilot project to identify patterns among Canadian social-media users who discuss "suicide-related behaviour." The initial $25,000 contract will run from January through June of this year. If the pilot succeeds, the company could be called upon to regularly report on such social-media trends so that the government can anticipate the behaviour and take preventive actions. The total project could be worth up to $400,000, with five annual contract-extension options worth $75,000 each, according to a document on the Public Works and Government Services Canada website.


Advanced Symbolics was formed in 2015, but is built on AI research conducted by its chief scientist, University of Ottawa adjunct professor Kenton White, dating as far back as 2009. The company calls its AI system Polly. With Polly, Dr. White says, the company plans to look backward for trends surrounding rising suicide rates in specific communities and regions. "What are the precursors? What are patterns leading up to past tragedies? We can see those patterns occurring, and then bring the resources in," he said in an interview.

CBC first reported on the contract on Tuesday. In an e-mailed statement, a Public Health spokesperson said the pilot is intended to build upon the agency's existing hospital data on suicide attempts. Suicide, they said, is the second-leading cause of death among 10-to-19-year-olds in Canada. "To help prevent suicide, develop effective prevention programs and recognize ways to intervene earlier, we must first understand the various patterns and characteristics of suicide-related behaviours," the spokesperson wrote.

Both Dr. White and Public Health told The Globe and Mail that, for privacy reasons, the pilot project is not intended to forecast for, or intervene in, individual cases. Rather, it will seek out regions where social-media patterns suggest rates could rise, in order to target prevention programs and bring in mental-health resources. Attawapiskat First Nation in Northern Ontario declared a state of emergency in 2016 over suicides and suicide attempts. One Nova Scotia school board saw three students die by suicide last year.

Dr. White describes Advanced Symbolics' business model as an AI-assisted improvement upon traditional market research. Rather than working with small-sample phone interviews, like many polling firms, the company can study the behaviours of 150,000 anonymized Canadians. The private company has had no funding rounds, he said.

The eight-person Ottawa startup was selected for the pilot, the announcement document says, because of its patented algorithm that builds geographically specific, randomized, controlled samples of social-media users. But Polly also uses other artificial-intelligence methods. Machine learning, for instance, helps it better classify users within its sample to match a population; text analytics and natural-language processing help the company assess whether a person is discussing the topic it wants to learn more about.

Working with Polly, he pointed out, is like working with a student: The AI provides patterns, experts provide feedback and the process continues until accuracy is high.

For the Public Health pilot project, that means working with mental-health experts who have experience detecting warning signs for suicide and self-harm.
