Still in fall of 2016: All around the world, people are asking “WHO? WHAT? HOW?” That’s when researchers found that American voters were influenced by misinformation on the internet.

The world is completely distressed. They demand that 1) someone take responsibility and 2) that they take action to fix the ‘fake news’ problem.

Oh – hi Larry Page and Mark Zuckerberg.

Who else other than Google and Facebook, right?

The public wants solutions from search engines and social media giants to tackle ‘fake news’ and any other misinformation on the internet.

May 2017: TA-DA! Welcome Project Owl.

Project Owl is introduced as Google’s answer to fake news. It plans to address the problem with new feedback forms for search suggestions and the answer box, and by prioritizing authoritative content in the answer box.

And no, we don’t see this affecting marketers or SEOs. As long as you continue to practice white hat methods, your day-to-day should stay the same. However, since this can affect searchers’ user experience, we see a few challenges.

Challenge #1: Search engines are supposed to be neutral

Google is walking a tightrope. If search engines manage to tackle fake news, then first, that feels like a violation of the First Amendment, and second, they will come off as biased toward or against specific news/media sources.

Remember, feedback from some users will change the search experience for everyone on that query. It will be difficult to differentiate what’s ‘right’ for one searcher versus what’s ‘right’ for another.

But, you know what? Once personalized search is the norm, this may not even be a challenge.

Challenge #2: The proposed plan

Let’s take a step back and look at Google’s track record when they are “working to fix” something. Just like with many updates in the past, Google says one thing and marketers notice something completely different.

Right now, according to Google, Project Owl will rely on searchers to provide feedback on autocomplete suggestions and featured snippets.

But, we’re missing the obvious.

Let me ask you: When was the last time you went in and changed any of your Google search settings? Or rather, did you even know that it was possible to change Google search settings?

Don’t feel bad – I know SEOs who didn’t even know they could do that!

Google said, and I quote, “We plan to use this feedback to help improve our algorithms.” That is what they told us years ago about the link disavow tool, and they still don’t have that right. My take is that it will be several years before Google is able to filter out “fake news”.

I personally think TMZ.com spreads a lot of fake news, yet it ranks for 2,133,648 keywords on Google; and I don’t think Google is going to start taking those keywords away anytime soon.

As you can see, I don’t think Google is going to put much into this, and even if they do, it will take years before it’s perfected. I believe Google is in crisis mode right now, but sooner or later people will forget, and Google will move on or deprioritize this.

Challenge #3: Obscure and infrequent queries

The third part of Google’s solution is prioritizing authoritative content specifically for obscure and infrequent queries. But when the audience is already such a niche group, how do you determine which source should be treated as the authority?

Challenge #4: The blackhats

Like with every other SEO tactic, there is always a group of SEOs that capitalizes on Google making an algorithmic change or giving us the ability to influence how the algorithm reacts.

I know blackhat SEOs are going to jump at this chance to devalue other people’s content that doesn’t serve their or their clients’ interests. They will probably work from Class C IP addresses and run bots at specific timing intervals to make the feedback seem natural.

Now what?

Overall, a first step is better than no step at all, but here are two ways I recommend to combat fake news more effectively.

First, Google should not rely solely on end users to report content that is fake or offensive. Its focus should be less on that and more on perfecting RankBrain, Google’s artificial intelligence system.

Second, it’s not just up to the Googles and the Facebooks to take action. It’s also your responsibility as a user to determine whether a search listing is worthy of your click and trust.

When you see something that sounds outrageous, it probably is. Hoaxes appeal to natural human curiosity, which is why it’s hard not to click, but still, that’s a choice you get to make.