Don’t Block your CSS or JS

This is a quick post for the non-SEO crowd about something that is probably flying beneath the radar of most designers.
In case you missed it back in June, Yoast explained how blocking access to these files in robots.txt can harm SEO. At the very least, go to Webmaster Tools and make sure Google can even render your page. I made the adjustments, but it's too soon to tell whether the traffic loss I have suffered is due to this or to my domain having a keyword in it. The SEO saga continues, and so does the SEO frustration.
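The usual culprit is a robots.txt that disallows the theme and script directories wholesale. A minimal sketch of the fix, assuming a WordPress-style layout (the paths here are illustrative, not from the original post); Google supports `*` and `$` wildcards in these rules:

```
# Hypothetical robots.txt: the Disallow line is a common old default.
User-agent: *
Disallow: /wp-admin/

# Explicitly re-allow the assets Google needs to render the page.
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

After changing the file, use Fetch as Google in Webmaster Tools to confirm the rendered page actually looks like your page and not an unstyled skeleton.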

A month ago Google introduced its Panda 4.0 update. Over the last few weeks we've been able to "fix" a couple of sites that got hit by it. These sites both lost more than 50% of their search traffic in that update. When they returned, their previous position in the search results came back. Sounds too good to be true, right? Read on. It was actually very easy.

This is still a theory, but a Googler responded to it, albeit in a cloudy way:

John Mueller of Google wrote:
Allowing crawling of JavaScript and CSS makes it a lot easier for us to recognize your site's content and to give your site the credit that it deserves for that content. Read more at Google Webmaster Help.

He later hedges that it may not matter much, but one has to think that our sites should always be able to render for Google, whether we get there by adjusting robots.txt or big G adjusts the product.
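You don't have to wait for a rankings dip to spot the problem. Python's standard library can evaluate robots.txt rules directly, so you can check whether Googlebot is allowed to fetch your asset URLs. A minimal sketch, using made-up paths and a hypothetical blocking robots.txt (swap in your own file and URLs):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that hides script and theme directories,
# the kind of old default this post is warning about.
rules = """User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/themes/""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot is refused the very files it needs to render the page...
print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js"))   # False
print(parser.can_fetch("Googlebot", "https://example.com/wp-content/themes/site/style.css"))  # False

# ...while the HTML itself is still crawlable, so the page "works" but renders blind.
print(parser.can_fetch("Googlebot", "https://example.com/about/"))  # True
```

In a real check you would call `parser.set_url("https://yoursite.com/robots.txt")` and `parser.read()` instead of parsing inline rules, then test each CSS and JS URL your pages load.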