MENLO PARK, Calif. – Silicon Valley giant Facebook has outlined plans to use third-party fact checkers to vet some of its news content.

Facebook has come under pressure from users on both the left and the right, who decry its data-collection practices and charge that it's attempting to control debate on hot-button issues.

The fact-checking plan appears to be an acknowledgement of sorts that new media may need a filter.

Gabriel Kahn, a journalism professor at the University of Southern California, says news today is different from the past, when there were fewer sources of information.

“Because there were so few sources, those sources took it upon themselves, for a number of reasons, to act more responsibly and produce, essentially, fact-based journalism,” he states. “Not that that system was perfect, but the expectation of the news consumers was that the news organizations were actually giving them real, fact-based journalism.”

But critics of this position say injecting people into the process to curate news for the public is a poor replacement for equipping the population with a firm understanding of civics, so people can grasp the facts for themselves.

Critics also warn that restricting content could curb free speech and limit access to a variety of points of view.

Facebook says it will test the new system first with a small number of users.

Kahn says it's natural to expect that these human fact-checkers will make mistakes, that the system will be human.

“Now, the ecosystem has completely changed, yet the notion of responsibility hasn’t filtered down to the public,” he points out. “Essentially, what’s going on is that we’ve devolved the responsibility for fact-based news from the news organization to the consumer.”

Kahn is reluctant to concede that deciding what's "fake" or "real" has a political dimension.

Facebook says it’ll be targeting “clear hoaxes spread by spammers for their own gain.”