
Idea to stop spam bots

Has anyone thought of using jQuery's .mousemove() to detect spam bots?
I'm assuming spam bots don't need to move the mouse while navigating a website (and therefore won't), whereas normal users do. Couldn't you use that to trigger some extra security only for suspected bots? (Like a harder validation step that would be too time-consuming or annoying to serve up to everyone.)

Something like

Code:

$('body').one('mousemove', function() {
    // Mouse moved: flag this visitor as (probably) human,
    // e.g. set a hidden form field or ping the server to set a session flag.
    // .one() fires only on the first movement, so the handler isn't run repeatedly.
});

Admittedly, this could cause some problems for mobile users... but maybe you'd only run it for non-mobile devices.

Well, JS doesn't run for spam bots; at least, it doesn't always. So this method is good in the sense that it checks for something real users would do, rather than trying to detect spam bots directly. The downside is that it's client side, so a bot author could read the code and fake the same behaviour. If someone is specifically targeting your site, it wouldn't help; it would catch generic bots though. It would catch some real users too (anyone without JS), but I guess falling back to harder validation for them might work out.

No.... No I didn't.
That was a funny tv show... Especially when the asteroid hit the earth... Oh back in the young'en days

Hmmm.... I didn't think of that, Daniel.
Maybe not having JavaScript enabled is in itself enough to trigger extra security.... Would noscript tags (don't judge me) work on a spam bot application?
I assume not... but maybe...

You'd have to create an exception for non-bots. Use JS to add a hidden field to the form with some value: either a fixed string like "not-a-bot", or better, an MD5 string that matches something stored in a session variable on the server.
Thus, if someone HAS this value, they're NOT a bot. If they don't, they MIGHT be a bot.

Unless of course, as I said, the bots learn to use the material because it is client side.

Perhaps AJAX... they could still theoretically get it, but it wouldn't be in the raw source code, so it would take a lot more work for them. How's that sound?

I was thinking a PHP session?
If the user has JavaScript disabled, the session flag never gets set, which provides the backup check (e.g. if the session var is set and equals true, skip the extra security; otherwise apply it).
And then write the session flag using AJAX.
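The session-as-backup gate could look something like this (a hypothetical check in JavaScript for illustration; `humanVerified` is an assumed flag name that the AJAX endpoint would set when the page's JS runs):

```javascript
// Hypothetical server-side gate. The AJAX call made by the page's JS
// sets session.humanVerified = true. If the flag is absent (JS disabled,
// or a bot that never ran the script), serve the harder validation.
function needsExtraSecurity(session) {
  return !(session && session.humanVerified === true);
}
```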

And if they've got JavaScript disabled, they couldn't accept cookies...
If I were making a bot, I'd disable JavaScript... it'd mean that half the validation on the website wouldn't work (the other half being backend), increasing my chances of getting a successful hit.
I'm fairly sure it was you who said that you couldn't use cookies to deny access to a site, Daniel. (I was working on a way to stop brute-force attacks and was using a cookie to log the number of form submissions. You said I couldn't do that because spam bots wouldn't accept cookies, or something like that.)

I'd actually say it's the less advanced bots that accept cookies and the more advanced ones that don't.... maybe.
But programming-wise, it's not very hard to disable JavaScript in a web-browsing application (depending, of course, on the language of their choice).

Cookies and JavaScript aren't the same thing; they work independently. A well-programmed bot WILL accept cookies for exactly that reason. Otherwise CAPTCHAs would always work even if they displayed just "1" each time, because they use a session (which rides on a cookie) to store the CAPTCHA's correct response.

By "advanced" I'm referring to how the bots are programmed. It's very easy to program a "bot" that just grabs the HTML of a page as text. I can do that in about 30 seconds with PHP, plus however much time it takes to add the functionality you'd like, such as submitting a form via POST. But it is much more difficult to build a browser for that bot: cookies, JavaScript, HTML rendering, etc. For example, a good CAPTCHA question would involve the geometry of the page; even though the bot technically has that information, it wouldn't see it as shapes, just as plain text never rendered into graphics. The relatively easy way to build a bot that can do those things is to attach it to a real browser, with the bot driving the browser. The hard way is to build that functionality from scratch just for the bot.
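To illustrate how little a "dumb" text-grabbing bot actually sees: given the raw HTML of a page, pulling out form field names is a one-liner, but anything that requires rendering (layout, shapes, executed JS) is invisible to it. A sketch in JavaScript (the function name and regex are illustrative, not robust HTML parsing):

```javascript
// A raw-HTML bot works on the page as a string. It can trivially find
// input names to fill in, but it never "sees" geometry or runs scripts.
function extractInputNames(html) {
  const names = [];
  const re = /<input[^>]*\bname="([^"]+)"/g;
  let m;
  while ((m = re.exec(html)) !== null) names.push(m[1]);
  return names;
}
```

This is why a geometry-based CAPTCHA question defeats such a bot even though the "answer" is technically present in the markup it downloaded.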