The SitePoint Forums have moved.


No no, that is not what I mean. There are programs you can use to download/grab a COMPLETE website (all pages, including images, etc.). I know there is a way to prevent it (by using .htaccess), but I don't remember where I read it.

No, not really. For someone to view the site, one essentially needs to be able to download the entire thing. One could ban any website-grabbing software which declares itself in the User-Agent header. Then again, most of it allows one to masquerade as Internet Explorer. If it is so important that it cannot be shared, it probably should not be on the public internet, no?
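To make the User-Agent idea concrete, here is a hedged .htaccess sketch using Apache's mod_rewrite; the list of agent strings is illustrative, not exhaustive, and assumes mod_rewrite is enabled:

```apache
RewriteEngine On
# Refuse (403) any request whose User-Agent matches a known site ripper.
# [NC] = case-insensitive match; [F] = return Forbidden.
RewriteCond %{HTTP_USER_AGENT} (HTTrack|WebZIP|Teleport|WebCopier) [NC]
RewriteRule .* - [F]
```

As noted above, this only stops grabbers that announce themselves honestly; anything masquerading as a regular browser sails straight through.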

"If it is so important that it cannot be shared, it probably should not be on the public internet, no?"

That is not the issue ... I have 10,000+ pictures posted across at least as many pages. If someone grabs the entire site it will cost a lot of bandwidth ... and I don't want some moron to copy my precious content so easily.

I know some people here at SPF use .htaccess to prevent this, I just can't find the threads.


First of all, you probably want to set up robots.txt so that images are out of bounds; that will prevent Google from indexing your pictures, for instance.

That may be enough.
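A minimal robots.txt for this would look something like the following; the `/images/` directory name is an assumption, so adjust it to wherever the pictures actually live:

```
User-agent: *
Disallow: /images/
```

Bear in mind robots.txt is purely advisory: well-behaved crawlers like Googlebot honour it, but nothing forces a ripper to.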

Then you could use server-side software to limit the amount of bandwidth you will serve to a single IP address. Of course a devious grabber could use multiple addresses, but that's unlikely.

Of course it would also stop legitimate users from browsing the whole site.
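One common way to do per-IP limiting on Apache is the mod_evasive module, which rate-limits requests per address rather than bandwidth as such. A hedged sketch, assuming mod_evasive is installed; the thresholds are illustrative only:

```apache
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        10    # max requests for the same page per interval
    DOSPageInterval     1     # interval in seconds
    DOSSiteCount        100   # max requests to the whole site per interval
    DOSSiteInterval     1
    DOSBlockingPeriod   60    # seconds an offending IP stays blocked
</IfModule>
```

Set the counts too low and, as noted above, you will also lock out fast-clicking legitimate visitors, so err on the generous side.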

Putting exclusions in your robots.txt is a good step, although I suspect that site-ripping software wouldn't pay much attention to it.

What you are looking for sounds like a Bot trap.

Basically, place a link on your pages (one that is hidden from legitimate users). Site ripping software will follow every link that it finds to download the content. If someone/something follows your hidden link then they're probably up to no good (and you can opt to block them).
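The trap logic described above can be sketched in a few lines of Python. This is a minimal in-memory illustration, not a drop-in server component; the trap URL and class name are hypothetical:

```python
# Bot-trap sketch: any client that requests the hidden trap URL is added to a
# blocklist, and all of its subsequent requests are refused. Real visitors
# never see the hidden link, so only rippers (which follow every link) hit it.

TRAP_PATH = "/trap/do-not-follow.html"  # target of the hidden link (assumed name)

class BotTrap:
    def __init__(self, trap_path=TRAP_PATH):
        self.trap_path = trap_path
        self.blocked = set()  # IPs that have sprung the trap

    def handle(self, ip, path):
        """Return True if the request should be served, False if refused."""
        if ip in self.blocked:
            return False
        if path == self.trap_path:
            self.blocked.add(ip)  # it followed the hidden link: block it
            return False
        return True

trap = BotTrap()
assert trap.handle("1.2.3.4", "/index.html")      # normal visitor: served
assert not trap.handle("5.6.7.8", TRAP_PATH)      # ripper hits the trap
assert not trap.handle("5.6.7.8", "/index.html")  # now blocked everywhere
```

In practice the blocklist would be persisted (e.g. written out to .htaccess `Deny from` lines or a firewall rule) rather than kept in memory, and the hidden link would be excluded in robots.txt so well-behaved crawlers don't trip it by accident.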