Hey all, I need to ask this again, but let me add a little more info so you understand what I am doing. I own a small business selling card models, and I have a horrible time with pirates stealing my work. Not really my work, but the work of all the designers whose models I sell; I legally represent them. I need to go to a website, go through the entire site, and scrape all the URLs of the illegal files hosted on sites like Hotfile, Rapidshare, etc. I spent about 20 hours working on one site over the last week, and bam, overnight the links were replenished.

Well, I'm of the opinion that if you make the legal way to get something the most convenient, the vast majority of people will use it rather than the illegal method - online TV, for instance. Nevertheless, I found the exercise amusing, so add the following line to ~/.bashrc:

function scrape { wget "$1" -qO - | sed 's/"/\n"\n/g' | sed '/http/!d'; }

(Note that "$1" is quoted so URLs containing characters like & don't get mangled by the shell.) From then on you'll be able to use e.g. "scrape www.linuxmint.com" to get a list of addresses (not necessarily valid ones, though).
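The one-liner can also be split into two pieces, which makes the parsing testable without a network connection. This is only a sketch, and extract_links is a name made up here, not part of the original function:

```shell
# extract_links reads HTML on stdin and prints anything that looks like
# an http(s) URL, one per line, deduplicated. It does roughly the same
# job as the sed pipeline in the post, using grep -o instead.
extract_links() {
    grep -oE 'https?://[^"'"'"' <>]+' | sort -u
}

# scrape fetches the page quietly to stdout ("$1" is quoted so URLs
# containing & or spaces survive) and pipes it through the extractor.
scrape() {
    wget "$1" -qO - | extract_links
}
```

As with the original, the output is a list of candidate addresses, not necessarily valid ones; grep just pulls out anything link-shaped.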

If you have a question that has been answered and solved, then please edit your original post and put [SOLVED] at the end of your subject header.
Hint - use a Google search including the search term site:forums.linuxmint.com

To add the line to ~/.bashrc, press ALT+F2 and type "gedit ~/.bashrc" into the run dialog, then copy and paste the line into the file before saving and exiting. You'll still need to use the terminal to actually run the command, though.
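If you'd rather stay in the terminal than open gedit, the same edit can be done by appending the line and re-sourcing the file. A sketch, which deliberately writes to a mktemp scratch file standing in for ~/.bashrc so it doesn't touch your real config (swap in ~/.bashrc when doing it for real); it also uses the portable scrape() form rather than the bash-only "function" keyword:

```shell
# Scratch file standing in for ~/.bashrc in this sketch.
RC="$(mktemp)"

# Append the scrape function (POSIX-style definition).
cat >> "$RC" <<'EOF'
scrape() { wget "$1" -qO - | sed 's/"/\n"\n/g' | sed '/http/!d'; }
EOF

# Re-read the file so the function is available in the current shell,
# just as a new terminal would pick it up from ~/.bashrc.
. "$RC"
```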
