Your Facebook News Feed Is Changing: Here's What to Expect

Friends and family are getting new priority.

Facebook admitted Thursday that constantly scrolling through impersonal, branded content is making us feel bad, so it’s changing the News Feed. The social media platform will now prioritize content that’s shared and commented on by friends and family and will give brands and publishers less air time.

Why Is the News Feed Changing?

The change reads as a tacit admission that social media use may be having a negative impact on our mental health, as studies have suggested in recent years. Although much of that research doesn’t single out impersonal, public posts, this is clearly Facebook’s attempt to encourage engagement and a positive attachment among users of a website that has increasingly become filled with media and branded content.

It could also be a prudent long-term business move. If Facebook can foster more positive feelings among users, it’s more likely to remain the most-used social media platform in the U.S.


What Does This Mean for Your News Feed?

Facebook said in a statement Thursday that the News Feed update will employ Facebook’s algorithms to “predict which posts you might want to interact with your friends about, and show these posts higher in feed.”

These are posts that inspire back-and-forth discussion in the comments and posts that you might want to share and react to – whether that’s a post from a friend seeking advice, a friend asking for recommendations for a trip, or a news article or video prompting lots of discussion.

Pages that generate more conversation within a user’s community will probably see more play in that user’s feed than pages that don’t. If there’s a particular public page that you want prioritized in your feed, you can choose “See First” in “News Feed Preferences.”

Whether this move could further encourage echo chambers — where users tend to see content they already agree with, while other information is filtered out, reinforcing their biases — remains to be seen. During the recent congressional hearings on Russian interference in the 2016 election, Facebook came under significant fire for failing to stop foreign actors from posting false and misleading information and manipulating voters by pandering to their value systems. Many of these posts came from fake personal accounts, not brands or publishers.