Say I have a script that executes after DOM ready. It loads additional resources (a Twitter feed and the like), so it adds to the total time spent 'waiting' for the page to load, even though it isn't content-critical.
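For context, a minimal sketch of the pattern being described — deferring a non-critical third-party script until after the window's load event so it doesn't block initial rendering. The helper name `loadDeferred` and the widget URL are illustrative, not from any particular library:

```javascript
// Sketch: defer loading a non-critical third-party script (e.g. a
// Twitter feed widget) until the page itself has finished loading.
// The function name and URL here are illustrative assumptions.
function loadDeferred(src) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true;       // don't block parsing once it does load
  document.body.appendChild(s);
  return s;
}

// Only wire this up in a browser environment.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    loadDeferred('https://platform.twitter.com/widgets.js');
  });
}
```

The content-critical page is fully usable before the widget ever starts downloading; the question is whether the extra download time after that point still counts against you.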

@toomanyairmiles That's not a duplicate. It's not asking whether Google can execute JS; it's asking whether the load time of other resources fetched via Ajax is counted as "page load time", thus hurting SEO.
– Florian Margaine, May 22 '12 at 7:12

@FlorianMargaine The answer to both questions is the same, and linking between the two is useful. I didn't say it was a duplicate, I said it was a possible duplicate. It's also a question that gets asked regularly.
– toomanyairmiles, May 22 '12 at 8:32

Perfect question! I've been looking for something like this for a while, and I finally decided to check.
– somdow, Nov 6 '12 at 12:10

1 Answer
1

We already do some pretty smart things like scanning JavaScript and
Flash to discover links to new web pages

They have been executing JavaScript since at least 2009 (I imagine they learned a lot from building Chrome), and Matt Cutts has publicly confirmed that they can follow JavaScript links, execute scripts, and submit forms.

"For a while, we were scanning within JavaScript, and we were looking
for links. Google has gotten smarter about JavaScript and can execute
some JavaScript. I wouldn't say that we execute all JavaScript, so
there are some conditions in which we don't execute JavaScript.
Certainly there are some common, well-known JavaScript things like
Google Analytics, which you wouldn't even want to execute because you
wouldn't want to try to generate phantom visits from Googlebot into
your Google Analytics".

The upshot of this is that, no, it isn't hurting your SEO, provided your JavaScript is well formed and relevant. If you really want to check, I'd recommend using Fetch as Googlebot in Webmaster Tools, and turning on server logging for Googlebot so you can watch where it's actually going.
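The log-watching suggestion above can be sketched with standard tools. This is a minimal example assuming a common combined-format access log; the sample lines and the `/tmp` path are illustrative — in practice you'd point `grep` at your real access log (e.g. `/var/log/nginx/access.log`):

```shell
# Sketch: list the URLs Googlebot actually requested, by filtering the
# access log on its user-agent string. Sample log lines are illustrative.
cat > /tmp/sample_access.log <<'EOF'
66.249.66.1 - - [22/May/2012:07:12:00 +0000] "GET /index.html HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
192.168.0.5 - - [22/May/2012:07:12:05 +0000] "GET /about.html HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 6.1)"
EOF

# Field 7 of the combined log format is the request path.
grep 'Googlebot' /tmp/sample_access.log | awk '{print $7}'
# → /index.html
```

Comparing the paths Googlebot fetches against what Fetch as Googlebot renders tells you whether your deferred resources are even on its radar.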