Facebook will now use ‘time spent’ on stories as an indicator of their meaningfulness.
Photograph: Dado Ruvic/Reuters

Facebook has announced the latest change to the algorithm governing what stories its users see in their news feeds on the social network.

The company says it hopes to help more “meaningful” stories bubble up in people’s feeds by looking beyond metrics like comments, likes and shares when judging what’s interesting.

“These factors don’t always tell us the whole story of what is most meaningful to you,” claimed Facebook software engineers Ansha Yu and Sami Tas in a blog post that cited recent market research conducted by Facebook.

“We learned that in many cases, just because someone didn’t like, comment or share a story in their News Feed doesn’t mean it wasn’t meaningful to them,” they wrote.

“There are times when, for example, people want to see information about a serious current event, but don’t necessarily want to like or comment on it.”

The engineers stressed that the algorithm does more than simply measure how many seconds people spend looking at a story, noting that a 10-second pause could indicate enjoyment, or simply reflect a slow internet connection.

“We’ve discovered that if people spend significantly more time on a particular story in News Feed than the majority of other stories they look at, this is a good sign that content was relevant to them,” they wrote.
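Facebook has not published how this comparison is made, but the idea of flagging a story when its dwell time well exceeds a user's typical dwell time can be sketched roughly as follows. The function name, the use of the median as the baseline, and the multiplier for "significantly more time" are all hypothetical illustrations, not Facebook's actual formula.

```python
from statistics import median

def looks_relevant(other_dwell_times, story_time, factor=2.0):
    """Hypothetical sketch: flag a story if the user spent significantly
    longer on it than on the majority of other stories they viewed.

    other_dwell_times: seconds spent on other recent News Feed stories.
    story_time: seconds spent on the story being scored.
    factor: assumed multiplier defining 'significantly more time'.
    """
    if not other_dwell_times:
        return False
    # Compare against the median so a few long reads don't skew the baseline.
    return story_time > factor * median(other_dwell_times)

# A user who skims most stories in a few seconds but lingers on one:
print(looks_relevant([2, 3, 3, 4, 5], 12))  # True under these assumptions
print(looks_relevant([2, 3, 3, 4, 5], 5))   # False: not far above typical
```

A real ranking system would of course blend a signal like this with the likes, comments and shares the post mentions, rather than use it alone.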


Facebook is stressing that pages run by companies and celebrities are not expected to be affected by this change – despite unrest in the past when similar tweaks have meant page posts reach fewer people on the social network.

Facebook has made a habit of announcing changes to its news-feed algorithm since August 2013, when the company explained that the average user could see 1,500 potential stories from friends and pages every time they log in – a number that the algorithm narrowed down to around 300.

The social network has sparked controversy with some of its changes to the algorithm in the past, however: it was strongly criticised in 2014 after details emerged of an experiment that hid “a small percentage” of emotional words from people’s news feeds to test what impact that had on their own posts.