Open file to read and write in Perl, oh and lock it too

In most cases when you need to update a file the best strategy is to read the entire file into memory,
make the changes, and then write the whole file back. Unless, of course, the file is too big, but that is
a separate story.

To demonstrate the problem we will run the counter_plain.pl script 1,000 times in each of two separate windows.
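The contents of counter_plain.pl are not shown in this article, so the following is a plausible sketch of such a script (the filename counter.txt is an assumption): it reads a number from a file, increments it, and writes it back, with no locking at all.

```perl
use strict;
use warnings;

my $file = 'counter.txt';    # assumed filename

# Read the current value (default to 0 if the file does not exist yet).
my $count = 0;
if (open my $in, '<', $file) {
    $count = <$in> // 0;
    close $in;
}

$count++;

# Write the new value back, overwriting the whole file.
open my $out, '>', $file or die "Cannot open $file for writing: $!";
print $out $count;
close $out;

print "$count\n";
```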

If you are using Linux or Mac you can use the following Bash snippet:

for x in {1..1000}; do perl counter_plain.pl; done

I have not tried this on Windows, and because Windows has a different file-locking methodology,
the results might be totally different.

If you execute the above command in two terminals at more or less the same time, you'll
see the numbers progressing, but they will not reach 2,000. They might even get reset
to 1 from time to time as the file operations of the two instances of the script collide.

Locking

On Unix-like operating systems, such as Linux and OS X, we can use the native file-locking mechanism
via Perl's flock function.

For this, however, we need to open the file for both reading and writing, using the '+<' open mode.

We could not open the file separately, once for reading and once for writing, because
closing a filehandle always releases the lock. The other instance of our script
might come in between the two open calls of our instance.
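In other words, a sequence like the following leaves a window between the two open calls where the other process can slip in. This is a sketch of the broken approach, not something to copy (the filename is again an assumption):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);    # LOCK_SH, LOCK_EX

my $file = 'counter.txt';    # assumed filename

# Make sure the file exists for this demonstration.
open my $init, '>>', $file or die "Cannot create $file: $!";
close $init;

# BROKEN: the lock taken while reading is released by close,
# so it does not protect the update that follows.
open my $in, '<', $file or die "Cannot open $file: $!";
flock($in, LOCK_SH) or die "Cannot lock $file: $!";
my $count = <$in> // 0;
close $in;    # the lock is gone from this point on

# Another instance of the script can read the same value right here.

$count++;

# ...and the write happens under a second, separate lock.
open my $out, '>', $file or die "Cannot open $file: $!";
flock($out, LOCK_EX) or die "Cannot lock $file: $!";
print $out $count;
close $out;
```

Note also that opening with '>' truncates the file immediately, before the exclusive lock is even acquired, which is one way a concurrently running instance can observe an empty file and reset the counter to 1.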

We needed to rewind the filehandle (using seek) so that we write the new content at the beginning of the file and not
at the end.

In this case we did not have to truncate the file, as the new content is never going to be shorter than
the old content (after all, the number is only ever incremented), but in the general case it is better practice.
It ensures that we don't have left-over content from the previous version of the file.
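Putting these pieces together, the locked version of the counter script might look like this (a sketch, since the original script is not shown; the filename is an assumption):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $file = 'counter.txt';    # assumed filename

# Make sure the file exists so that '+<' can open it.
open my $init, '>>', $file or die "Cannot create $file: $!";
close $init;

# Open once, for both reading and writing, so a single lock can
# cover the whole read-update-write cycle.
open my $fh, '+<', $file or die "Cannot open $file: $!";
flock($fh, LOCK_EX) or die "Cannot lock $file: $!";

my $count = <$fh> // 0;
$count++;

# Rewind so the new value overwrites the old one at the start
# of the file instead of being appended after it.
seek($fh, 0, 0) or die "Cannot seek in $file: $!";
print $fh $count;

# Not strictly needed for an ever-growing number, but good practice:
# drop any left-over bytes from the previous content.
truncate($fh, tell($fh)) or die "Cannot truncate $file: $!";

close $fh or die "Cannot close $file: $!";    # closing releases the lock
print "$count\n";
```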

If you try to run this script 1,000 times each in two separate windows, you'll see the counter reach 2,000 as expected.