draco 0

I don't really see the point. Your computer will process all the commands as quickly as it can. Running multiple scripts doesn't increase the speed at which it runs through the functions beyond however many processors you have. Each processor can only do one thing at a time; it only looks like it's doing more because it switches between tasks so fast.


smashly 11

draco wrote:
I don't really see the point. Your computer will process all the commands as quickly as it can. Running multiple scripts doesn't increase the speed at which it runs through the functions beyond however many processors you have. Each processor can only do one thing at a time; it only looks like it's doing more because it switches between tasks so fast.

The point would be:

Say Sub(1) takes 15 mins to complete, and the PC's CPU and memory usage sit at about 2% while it's doing its thing.

Sub(2) takes 17 mins to do its thing, again at about 2% CPU and memory usage...

Sub(3) takes... etc.

So what should I do: wait 32 mins (15 + 17) for Sub(1) and Sub(2) to finish one after the other, or run them both at the same time and have the whole job done in roughly 18 mins using roughly 5% of my CPU and memory?

So to me there is a great point in asking something like the OP asked, IMHO.

I agree that you could never get all the subs starting at the exact same nanosecond, but a few milliseconds apart wouldn't be so bad.

Write each sub out from the main script into its own temp file and run the AutoIt command line against each temp-file script.

This leaves the main script free to do other things while the subs get their jobs done in a shorter amount of time.
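The idea above can be sketched roughly like this. This is only a minimal illustration, not code from the thread: the file names and the Sleep() stand-in for each sub's work are made up, and it assumes the main script is running uncompiled so that @AutoItExe points at the AutoIt3 interpreter (a compiled script would need the /AutoIt3ExecuteScript switch instead).

```autoit
; Sketch: write each "sub" to its own temp .au3 file and launch it as a
; separate AutoIt process, so the subs run in parallel with the main script.

Local $aPid[2]

For $i = 0 To 1
    Local $sTmp = @TempDir & "\Sub" & $i + 1 & ".au3"
    Local $hFile = FileOpen($sTmp, 2) ; 2 = open for overwrite
    FileWrite($hFile, "Sleep(10000)") ; stand-in for the sub's real work
    FileClose($hFile)

    ; Run the temp script under the same AutoIt interpreter and keep its PID.
    $aPid[$i] = Run('"' & @AutoItExe & '" "' & $sTmp & '"')
Next

; The main script is free to do other work here...

; ...and can wait for (or poll) the subs when it needs the results.
For $i = 0 To 1
    ProcessWaitClose($aPid[$i])
Next
```

Swapping ProcessWaitClose() for an occasional ProcessExists() check in the main loop would keep the main script fully responsive instead of blocking at the end.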