Robots.txt block all

Just getting to grips with a duplicate content issue on my site. To be brief, I'm using a combination of the All In One SEO plugin and a robots.txt file.

I can't understand why the suggested robots.txt file for WordPress (posted on the WP support forum, I think) disallows all your PHP pages.

I am sure I am not the only one who has PHP pages on their site that they want indexed.

So,
1. Why disallow the WP PHP pages?
2. If it is a good thing to do, how do I disallow those pages while still allowing the ones I want Google to find?

Thanks in advance,

Ade

Well, some files just weren't meant to be seen. I keep the WP core files and some processing files out of public view for potential security reasons. Blocking them doesn't affect your rankings or anything. Here's how my robots.txt works.

The line `User-agent: *` addresses all crawlers.
Then you add Disallow rules for whatever you want hidden, and Allow rules (or simply no rule) for whatever you want indexed.
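As a rough sketch (this is my own setup, not an official WordPress recommendation), a robots.txt along these lines blocks the core directories while leaving your public PHP pages crawlable:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-admin/admin-ajax.php
```

Anything not matched by a Disallow rule is crawlable by default, so a page like /contact.php stays indexable without needing an explicit Allow line.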
