The Internet has become a vast terrorism tutorial for would-be jihadists, and social networking sites should create a zero-tolerance policy for online terrorism and hate groups, a leading human rights organization says.

A pro-terrorism subculture is increasingly easy to find online, says Rabbi Abraham Cooper, associate dean of the Simon Wiesenthal Center in Los Angeles. The center works to combat anti-Semitism, hate groups and terrorism. Overseas terror groups now tell potential recruits, "You don't have to train here. You can just download the instructions and take action right where you are," he says.

Cooper will present the center's 2013 Digital Terrorism and Hate report at a briefing Wednesday on Capitol Hill. The briefing is being held by Reps. Eliot Engel, a New York Democrat and the ranking member of the House Foreign Affairs Committee, and Ed Royce, a California Republican who chairs the committee.

The kind of information available online is changing rapidly, Cooper says. "Three years ago it was almost all in Arabic," he says. "Now it's in English and multiple other languages."

"I don't think there's any question that terrorist instructional materials are more available online than they used to be," says Mark Potok, senior fellow at the Southern Poverty Law Center, which monitors domestic hate groups. "The Boston bombers are the classic example of that. They got the pressure-cooker bomb instructions online."

The Wiesenthal report, now in its 15th year, assigns letter grades to online companies based on how they deal with hate and terror material. Facebook got an A-minus because of its effort to eliminate digital prejudice and hate on the site. YouTube was graded C-minus because of the number of how-to terror videos on the site.

"There's still just way too much how-to info on YouTube. There's no reason for it. It technically violates their rules but they don't look into it," Cooper says.

YouTube responded with a written statement: "YouTube's guidelines prohibit material intended to incite violence, train terrorists, or that contains hate speech, and we encourage people to flag anything they see that they believe breaks the rules. We review flagged videos around the clock, routinely removing material that violates our guidelines."

Twitter got an F because "you can post anything you want without being screened or removed," Cooper says.

Twitter did not respond to a request for comment.

Cooper plans to tell Congress that "draconian government oversight" isn't necessary, that Internet companies simply need to remove material that violates their own terms of use. "Their platforms are being leveraged by the extremists," he says.

That's no easy task, Potok says. "It's easy to say these companies should police themselves strenuously," he says. "The reality is there is such an enormous amount of information going out every minute of every day that it's very tough to keep up with it."