Pain recognition system designed to help sheep welfare

A new artificial intelligence system has been developed to detect whether a sheep is in pain from its facial expressions.

Designed by researchers at the University of Cambridge, the innovation could help diagnose and treat common but painful conditions in animals.

By recognising five distinct facial expressions in a sheep's face, the system can estimate both whether the animal is in pain and how severe that pain is, with the aim of improving sheep welfare.

Building on earlier computer-vision work that detects emotions and expressions in human faces, the programme applies the same techniques to sheep. It assesses distinct parts of a sheep’s face and compares them with a standardised measurement tool developed by veterinarians for diagnosing pain.

The innovation was inspired by a previous project in 2016, in which Dr Krista McLennan, an animal behaviour lecturer at the University of Chester, developed a Sheep Pain Facial Expression Scale (SPFES). The tool is used to measure pain levels based on facial expressions, although training people to use it was time-consuming and individual bias threatened consistent scores.

Building on this, research leader Professor Peter Robinson, of the Computer Laboratory at the University of Cambridge – who usually focuses on teaching computers to recognise emotions in human faces – met with McLennan to develop the project to read animal emotions.

“There has been much more study over the years with people,” said Robinson. “But a lot of the earlier work on the faces of animals was actually done by [Charles] Darwin, who argues that all humans and many animals show emotion through remarkably similar behaviours, so we thought there would likely be crossover between animals and our work in human faces.”

The SPFES identifies five main changes in a sheep’s face when it is in pain: narrowing eyes; tightened cheeks; ears folded forward; lips pulled down and back; and nostrils changed from a U shape to a V shape. The SPFES then ranks these characteristics on a scale of one to 10 depending on the severity of pain.
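To make the scale concrete, the five facial changes and the one-to-10 ranking can be sketched as a simple scoring function. This is purely illustrative: the feature names, the per-feature severities, and the averaging rule are assumptions for the example, not the actual SPFES or the Cambridge system.

```python
# Hypothetical sketch of an SPFES-style score: each of the five facial
# action units from the article gets a severity rating (1-10), and the
# overall pain estimate here is simply their mean. The real tool's
# combination rule is not described in the article.

FEATURES = [
    "eye_narrowing",
    "cheek_tightening",
    "ears_folded_forward",
    "lips_pulled_down_and_back",
    "nostrils_u_to_v",
]

def spfes_style_score(severities: dict) -> float:
    """Average the five per-feature severities (each 1-10) into one score."""
    values = [severities[f] for f in FEATURES]
    return sum(values) / len(values)

example = {
    "eye_narrowing": 7,
    "cheek_tightening": 6,
    "ears_folded_forward": 8,
    "lips_pulled_down_and_back": 5,
    "nostrils_u_to_v": 7,
}
print(spfes_style_score(example))  # 6.6
```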

“The interesting part is that you can see a clear analogy between these actions in the sheep’s face and similar facial actions in humans when they are in pain – there is a similarity in terms of the muscles in their faces and in our faces,” said co-author Dr Marwa Mahmoud, a postdoctoral researcher in Robinson’s group. “However, it is difficult to normalise a sheep’s face in a machine learning model. A sheep’s face is totally different in profile than looking straight on, and you can’t really tell a sheep how to pose.”

Researchers used a series of about 500 images of sheep to train the programme, each photo labelled with the animal’s pain level according to the SPFES. Early tests showed that the system could estimate pain levels with 80% accuracy.
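The workflow described above — turning labelled photos into a trained classifier and measuring its accuracy on held-out examples — can be sketched in miniature. Everything here is a stand-in: synthetic feature vectors replace the real sheep images, a simple nearest-centroid rule replaces whatever model the Cambridge team actually used, and the labels are generated, not real SPFES ratings.

```python
# Toy illustration of train-then-evaluate on ~500 labelled examples.
# Synthetic data stands in for sheep photos; this is NOT the team's code.
import numpy as np

rng = np.random.default_rng(0)
n_images, n_features = 500, 10                 # ~500 photos, toy feature vectors
X = rng.normal(size=(n_images, n_features))
y = (X[:, :3].sum(axis=1) > 0).astype(int)     # stand-in pain / no-pain labels

# Hold out 100 images for testing, train on the remaining 400.
X_train, X_test = X[:400], X[400:]
y_train, y_test = y[:400], y[400:]

# Nearest-centroid classifier: one centroid per label, assign by distance.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
preds = dists.argmin(axis=1)

accuracy = (preds == y_test).mean()
print(f"held-out accuracy: {accuracy:.0%}")
```

The point of the sketch is the evaluation protocol — a model trained on labelled faces is judged by its accuracy on images it has never seen, which is what the reported 80% figure refers to.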

The next step is to train the system to recognise faces in moving images, and to handle sheep seen both head-on and in profile.