Perlbeginner1 has asked for the
wisdom of the Perl Monks concerning the following question:

Good day, dear Monks,

This is a question about a storage issue. The main part of a little Mechanize script works well; I want to store the results in a directory so that the gathered files do not clutter my machine. So here we go with the outline and the question - I look forward to hearing from you.

I need thumbnails of websites. I tried wget, but that does not work for me, since I need some rendering. What is needed: I have a list of 2,500 URLs, one per line, saved in a file. I want a script - see below - to open that file, read a line, then retrieve the website and save the page as a small thumbnail. Since I have a bunch of websites (2,500), I have to decide on a naming scheme for the results.
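For the reading and naming part, something along these lines is what I have in mind - a minimal sketch, where urls.txt is just a placeholder for my list file and hashing each URL with the core module Digest::MD5 is only one possible way to get short, unique, filesystem-safe names (a plain counter would do as well):

#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my $url_file = 'urls.txt';    # placeholder: the file with one URL per line

open my $fh, '<', $url_file or die "Cannot open $url_file: $!";
while ( my $url = <$fh> ) {
    chomp $url;
    next unless $url =~ /\S/;    # skip blank lines

    # one safe, unique file name per URL
    my $name = md5_hex($url) . '.png';
    print "$url -> $name\n";

    # ... the existing Mechanize/rendering step would go here
    # and produce the actual thumbnail to be saved under $name ...
}
close $fh;

The retrieving and rendering itself is not shown here - that is the part of the script that already works.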

Note: everything was nice and ran well so far, until I tried to add one special option: I wanted the script to store the results in a folder.
Well, what do you think about the idea of storing the results in a folder called images (or similar)? Is this doable? It would help a lot, since the results would be kept together in one folder and the many files would not clutter the machine.
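What I imagine is roughly the following - a minimal sketch using only core modules, where the folder name images and the variables $name and $png_data are placeholders for whatever the rendering step hands back:

#!/usr/bin/perl
use strict;
use warnings;
use File::Path qw(make_path);
use File::Spec;

my $out_dir = 'images';                    # placeholder folder name
make_path($out_dir) unless -d $out_dir;    # create it if it is missing

my $name     = 'example.png';              # placeholder thumbnail name
my $png_data = '';                         # placeholder: raw PNG bytes

# build the full path portably and write the bytes there
my $path = File::Spec->catfile( $out_dir, $name );
open my $out, '>:raw', $path or die "Cannot write $path: $!";
print {$out} $png_data;
close $out;
print "Saved $path\n";

make_path creates the folder only if it is not there yet, and File::Spec->catfile joins the folder and the file name portably, so every thumbnail lands under images/ instead of in the current directory.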

I ran into some issues when I tried to do it - to store the results in a directory like this:

You've been posting problems relating to this task since December last year. You have consistently failed to take people's advice, you have failed to learn from your mistakes, and you have pretty consistently failed to provide the code which you're running (the error message for $images you display here is impossible to generate from the code posted). You've been advised many times on these issues, both in reply posts and in CB discussions. Either you're not interested in learning how to do this yourself, or you're not capable. If you just want this to work, pay someone to do it for you.