I'm thinking of a situation where I would have something that creates a copy of a directory, tweaks a few files, and then does some processing on the result. This would be done fairly often, maybe a few dozen times a day. (The exact use case is testing patch submissions: dupe the code, patch it, build/test/report/etc.)

What I'm looking for could be done by creating a new directory structure and populating it with hard links from the original. However, this only works if all the tools you use delete and recreate files rather than edit them in place.
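To make the hard-link idea concrete, here's a small sketch (GNU `cp` and `stat` assumed, and hypothetical file names) showing both the space savings and the in-place-edit hazard described above:

```shell
set -e
cd "$(mktemp -d)"

# Create an original tree and a hard-linked "copy" of it.
mkdir orig
printf 'hello\n' > orig/file.txt
cp -al orig copy            # -l: hard-link files instead of copying data

# Both names refer to the same inode, so the copy costs no data space.
[ "$(stat -c %i orig/file.txt)" = "$(stat -c %i copy/file.txt)" ]

# Hazard: a tool that edits in place writes through to the original.
printf 'clobbered\n' >> copy/file.txt
grep -q clobbered orig/file.txt     # the original changed too!

# Safe: a tool that writes a new file and renames it over the old one
# breaks the link, leaving the original untouched from then on.
printf 'patched\n' > copy/new.tmp
mv copy/new.tmp copy/file.txt
grep -q patched copy/file.txt
grep -vq patched orig/file.txt
```

This is why the approach depends on tool behavior: `patch`, most compilers, and editors that rename a temp file are safe, but anything that opens the existing file for writing will corrupt the original tree.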

What kind of system is this expected to be used on?
– John Gardeniers, Apr 8 '10 at 4:56

It's speculative enough that, unless you know of a way to do it for some setup (in that case, just post it), I wouldn't want people burning time figuring out how to do it (unless they are doing it for fun :)
– BCS, Apr 8 '10 at 15:19

2 Answers

If you have your choice of platform for your fileserver, I'd go with a recent OpenSolaris build and use the deduplication feature of ZFS. This way copies of files would take up no additional space, and even common segments between files would not be replicated.