I'm using Ubuntu 10.04. I have a large directory (with many sub-directories) and data files that I use quite often in my scripts. I would like to make sure I never accidentally remove, overwrite, or change any of the files or directories under this top-level directory.

Create a copy of the tree with hard-linked files and point your scripts at the copy. Your scripts can still modify the contents of existing files (hard links share the same data), but creating, removing, or replacing files will only affect the copy.
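As a minimal sketch (the paths are placeholders, and GNU cp's -l option is assumed), the hard-linked mirror can be created like this:

```shell
#!/bin/sh
set -e
src=./data        # hypothetical master tree to protect
work=./data-work  # hard-linked working copy for the scripts

# Demo setup: a small tree standing in for the real directory.
mkdir -p "$src/sub"
echo "original" > "$src/sub/file.txt"

# -a preserves the tree and its attributes; -l makes hard links instead
# of copying file data, so the mirror costs almost no disk space.
cp -al "$src" "$work"

# Removing a file from the copy leaves the master intact:
rm "$work/sub/file.txt"
test -f "$src/sub/file.txt" && echo "master still intact"
```

Because hard links share one inode, editing a file in place through the copy still changes the master; only create/remove/replace operations are isolated.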

There is no foolproof way: whatever you do, you will still be able to delete the files in the end.

Still, you could try file permissions. Right-click the directory, set folder access to "Access files" and file access to "Read-only", then click on Apply Permissions to Enclosed Files. This way you will have to undo the setting manually before you can delete or change files in that directory.
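The same lockdown can be done from a terminal with chmod; a minimal sketch, where the directory path is a placeholder for your real one:

```shell
#!/bin/sh
set -e
umask 022
dir=./protected   # hypothetical directory; substitute your own

# Demo setup: one file standing in for the real data.
mkdir -p "$dir"
echo "data" > "$dir/file.txt"

# Recursively drop all write bits: files become read-only, and the
# directories can still be entered and listed but not modified.
chmod -R a-w "$dir"

# Later, to edit or delete again, restore write permission for yourself:
# chmod -R u+w "$dir"
```

Note that this only guards against slips made under your own account; it does not stop root, and you can always chmod the permissions back.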

To err is human. I once deleted an important file simply because I had just woken up and was still sleepy (no hangover involved) :) Fortunately, it happened long after I had learned some tricks that help mitigate the impact of such silly actions. Here they are:

Always make backups, and periodically verify that the backups actually contain what they are supposed to contain. This avoids the situation where you lose a file, try to restore it from a backup, and discover that for some reason it was never backed up.

Use version control.

Use a monitoring tool based on inotify (for example, inotifywait from the inotify-tools package); logging filesystem events as they happen will help you reconstruct what changed and when.
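The backup-verification tip above can be sketched with plain GNU tar, whose --compare (-d) option checks an archive against the live filesystem so omissions or corruption surface before you actually need the backup (paths here are placeholders):

```shell
#!/bin/sh
set -e
src=./files           # hypothetical tree to protect
backup=./backup.tar   # hypothetical backup archive

# Demo setup: one file standing in for the real data.
mkdir -p "$src"
echo "important" > "$src/file.txt"

# Make the backup...
tar -cf "$backup" "$src"

# ...and periodically verify it: -d (--compare) exits non-zero and
# reports any file that differs between the archive and the filesystem.
tar -df "$backup" && echo "backup matches the live tree"
```

In a real setup you would run the verification from cron against the actual backup medium, not the same disk.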

Since that incident I have followed one additional, ironclad rule: "Never touch a computer while still sleepy."