I've been asked to write a script by the company I work for, but I'm having trouble getting the data.

Details are: the script will run from logserv daily via cron and collect files from 4 different servers for the previous day. It collects files like /home/sys1/event_logs_201211010000.audit

Now I can put the public key on each machine, no issues there, and just use a system call to retrieve the file. My problem is: if the application restarts, another file with a different timestamp will be generated, e.g. /home/sys1/event_log_2012110172314.audit

Ideally I'd like all 4 files (1 from each server, for the previous day) to be collected and stored in 1 file on logserv in a given directory.

Can anyone offer any suggestions as to how I get around the multiple files, should the app restart?
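One way round the restart problem is to stop fetching one exact filename and instead match every file for the target date with a glob. A minimal, self-contained demonstration (the sample filenames are the ones from the question; the temp directory stands in for the remote server):

```shell
#!/bin/sh
# Demonstrate that a date-based glob picks up both the original file
# and a second file created by an application restart.
dir=$(mktemp -d)
touch "$dir/event_logs_201211010000.audit"    # original daily file
touch "$dir/event_log_2012110172314.audit"    # extra file after a restart
# One pattern covers every file for 2012-11-01:
ls "$dir"/event_log*_20121101*.audit
```

In the real script you would use the same pattern remotely, e.g. `scp "$host:/home/sys1/event_log*_${day}*.audit" ...` (scp expands the glob on the remote side), so a restart just means one more file arrives.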

You first said you have no problem retrieving the files, and now you ask how to do it. You don't say enough about your environment, but I guess that you can use FTP to transfer your files (see the Net::FTP module on CPAN). Since you mentioned something about a secured connection, you might want to take a look at the Net-FTPSSL module (http://search.cpan.org/~kral/Net-FTPSSL-0.04/FTPSSL.pm) or some other FTP modules on CPAN.

Sorry, let me explain a bit further. I need to write a script that collects files from 4 different machines and stores them as 1. To get the data, I connect to each server and store the names of the files into an array.
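The combine step itself is just a concatenation once the per-host files are on logserv. A sketch, with a temp directory standing in for the collection area and the file contents made up for illustration:

```shell
#!/bin/sh
# Concatenate the per-host files for one day into a single file.
# Directory and filenames are placeholders.
dir=$(mktemp -d)
printf 'sys1 events\n' > "$dir/sys1_event_logs_201211010000.audit"
printf 'sys2 events\n' > "$dir/sys2_event_logs_201211010000.audit"
day=20121101
cat "$dir"/*_"${day}"*.audit > "$dir/combined_${day}.audit"
wc -l < "$dir/combined_${day}.audit"
```

With 4 hosts you'd have 4 input files; the glob-and-cat stays the same.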

Thanks for the tip. I've made the amendment and have nearly got it finished; one last niggle!

The below works brilliantly, but if I copy a file with the same date/time from more than one host, it will get overwritten. What I'm trying to do now is add the $host to the beginning of the saved filename. I've tried various ways but can't get it to work; the dstfile and basename variables from "use File::Basename" are screwing it up.
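The usual fix is to build the destination name from the host plus the basename of the source path, so identical timestamps from different hosts land in different files. A sketch in shell terms ($host and the paths are placeholders; Perl's File::Basename basename() behaves the same way as the basename command here):

```shell
#!/bin/sh
# Prefix the host name to the saved file so files with identical
# timestamps from different hosts don't collide.
host=sys1
src=/home/sys1/event_logs_201211010000.audit
dstfile="/var/log/collected/${host}_$(basename "$src")"
echo "$dstfile"   # /var/log/collected/sys1_event_logs_201211010000.audit
```

The equivalent Perl would be `my $dstfile = "$dstdir/${host}_" . basename($src);` — the key point is to prefix the host before joining the destination directory and filename, not after.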