I currently have a C# script that processes text files for me. The script parses each file for segments of text, creates an HTML file for each match found, and then inserts it into a database.

When we run this in C# on a large volume of files, the time taken to complete can run into several hours. I need to get the processing time down, and everyone is telling me that if it's text parsing, then Perl is the way to go.

If this is something you might be interested in, then please drop me a line.

A compiled program written in C# or any other similar compiled language will always be faster than a Perl script IF it is written correctly. If the programmer didn't code it correctly, the program, no matter which language it was written in, would be inefficient/slow.

Without knowing anything about your current program and parsing details, I can't say for sure whether a Perl script could process your files any faster, but several hours for a C# program does sound like it may not have been coded correctly.

The text files we receive contain cases, and there can be up to 2000 cases per file. Our script parses out each case, then runs another parse on the extracted case for extra values, then creates an HTML file, and finally inserts everything into a DB.
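Since the actual case format isn't shown in the thread, here is a minimal sketch (in Python, just to illustrate the pipeline shape) of the three stages described above: split the file into cases, extract extra values from each case, and render HTML. The "CASE &lt;id&gt;" header and "Key: Value" field layout are assumptions, not your real format:

```python
import re

# Hypothetical input: assume each case starts with a "CASE <id>" header
# line and fields inside a case look like "Key: Value".
sample = """CASE 1
Name: Alice
Status: Open
CASE 2
Name: Bob
Status: Closed
"""

def split_cases(text):
    """Stage 1: split the file into individual case blocks."""
    # re.split with one capture group yields [pre, id, body, id, body, ...],
    # so pair up the ids with their bodies.
    blocks = re.split(r"(?m)^CASE\s+(\d+)\s*$", text)
    return list(zip(blocks[1::2], blocks[2::2]))

def extract_fields(body):
    """Stage 2: pull "Key: Value" pairs out of one case block."""
    return dict(re.findall(r"(?m)^(\w+):\s*(.+)$", body))

def to_html(case_id, fields):
    """Stage 3: render a minimal HTML fragment for one case."""
    rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in fields.items())
    return f"<h2>Case {case_id}</h2><table>{rows}</table>"

cases = [(cid, extract_fields(body)) for cid, body in split_cases(sample)]
html_docs = [to_html(cid, fields) for cid, fields in cases]
```

One point worth checking in your real code: if each of the 2000 cases is re-scanned against the whole file (rather than splitting once and then working per-case), the run time grows quadratically, which alone could explain a multi-hour run.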

If we run 1 file, it will take around 2 minutes, but while doing so the memory and CPU usage will be high. That isn't a problem in itself, but the issue comes in the volume of text files we receive: we can get around 2000 a week to process for each client.

At present we are still in setup, so we are processing for test purposes only, but when we go live we could receive 500 a day for each client, so in effect we could have multiple scripts all running at the same time. We are running a single thread, so it processes one file at a time.

Does Perl use fewer resources on the server than other languages?

With the possibility of so many scripts running, I need to find the best way to process multiple files at the same time for various clients without placing a burden on the server.
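Whatever language ends up being used, the usual answer to this is a fixed-size worker pool rather than launching one script per file: the pool size caps how much load the server takes at once, while still processing several files in parallel. A minimal sketch in Python (the file names and `process_file` body are placeholders, not your actual code):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical stand-in for the real per-file work
# (parse cases, build HTML, insert into the DB).
def process_file(path):
    return f"processed {path}"

def process_batch(paths, max_workers=4):
    """Process many files concurrently; max_workers caps server load.

    For CPU-bound parsing in CPython, ProcessPoolExecutor (same API)
    sidesteps the GIL; in C#, Parallel.ForEach with MaxDegreeOfParallelism
    plays the same role.
    """
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(process_file, p): p for p in paths}
        for fut in as_completed(futures):
            results.append(fut.result())
    return results

done = process_batch([f"file_{i}.txt" for i in range(10)])
```

The design choice here is that concurrency lives in one long-running process with a bounded pool, so 500 incoming files a day queue up behind a fixed number of workers instead of spawning 500 competing processes.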