Hi,
I have a site under very active day-to-day development. I need to see which files my developer has worked on and back them up for recovery.
The full build runs into gigabytes, so I don't want to back up all the files. Storage is seriously expensive.

I am looking for a Linux script to
1. First back up the changed files every hour, and
2. Then rsync all files from the master directory to a secondary folder on the same or a remote server every hour.


generally speaking, you would use something like the command sketched below
you find the files with find (replace . with a specific path if you are not working from the current folder)
the -mmin -60 is the time window; it matches anything modified in the last 60 minutes
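a minimal sketch of just the find part (the path and the 60-minute window are placeholders):

find . -type f -mmin -60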

then filter the results through xargs, which runs an rsync to the destination for each file

you can use whatever rsync options you want instead of the -tgcop used in the example

you can also run multiple rsync processes in parallel by adding -P x (keep x under 8 for performance reasons) right after xargs, though you could leave it out if you have very large files

replace /bar/ with the actual local destination
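putting it together, a sketch of the full pipeline (/path/to/master and /bar/ are placeholders; -print0 and xargs -0 keep filenames with spaces safe):

find /path/to/master -type f -mmin -60 -print0 | xargs -0 -I{} rsync -tgcop {} /bar/

and with parallel transfers, -P goes right after xargs:

find /path/to/master -type f -mmin -60 -print0 | xargs -0 -P 4 -I{} rsync -tgcop {} /bar/

note that copying file-by-file like this flattens everything into /bar/; if you want the directory structure preserved, run find from inside the master directory and add rsync's -R (--relative) option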

as far as a remote server is concerned, to automate this you would need to set up ssh keys so rsync can log in without prompting for a password
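a minimal sketch of the key setup (user@backuphost and the key path are placeholders):

ssh-keygen -t ed25519 -N "" -f ~/.ssh/backup_key
ssh-copy-id -i ~/.ssh/backup_key.pub user@backuphost

then point rsync at the remote side, e.g. rsync -tgcop -e "ssh -i ~/.ssh/backup_key" {} user@backuphost:/bar/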

rsync has an option to do incremental backups by creating hardlinks to files that did not change, using --link-dest. it needs a little scripting to be usable from cron. here is an example script to handle incremental backups. the script assumes the destination dir is local, but you can adapt it differently; it is just simpler to pull rather than push in this context
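a minimal sketch along those lines, assuming a local /backups root pulling from a remote master (the host, paths, and snapshot layout are all placeholders):

#!/bin/bash
# hourly incremental backup: each run creates a timestamped snapshot;
# files unchanged since the previous snapshot are hardlinked, not copied
SRC="user@devhost:/path/to/master/"   # remote source (we pull from it)
DEST="/backups"                       # local backup root (assumes no spaces)
STAMP=$(date +%Y%m%d-%H%M)
LATEST="$DEST/latest"                 # symlink to the newest snapshot

# hardlink unchanged files against the previous snapshot, if one exists
LINK=""
if [ -d "$LATEST" ]; then
    LINK="--link-dest=$LATEST"
fi

# -a preserves permissions/times/ownership; --delete mirrors removals;
# only update the "latest" symlink if the transfer succeeded
rsync -a --delete $LINK "$SRC" "$DEST/$STAMP" \
  && rm -f "$LATEST" \
  && ln -s "$DEST/$STAMP" "$LATEST"

run it from cron every hour, e.g.

0 * * * * /usr/local/bin/hourly-backup.sh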