Yes, MacFuse does it. It's just that the download page for the sshfs binary serves the binary with a .gz extension, so you have to rename it instead of gunzip'ing it. Annoying, but it works now. Thanks!
– Yar Feb 7 '10 at 16:32
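For anyone hitting the same snag, a minimal sketch of that workaround (the filename is hypothetical; the point is that the download is already an executable, not a gzip archive):

    # Hypothetical filename from the download page; despite the .gz
    # extension the file is not gzip-compressed, so rename it instead:
    mv sshfs.gz sshfs
    chmod +x sshfs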

You need to have rsync on the remote host -- not necessarily an rsync server (which, I believe, won't handle the ssh connection anyway), but the rsync on the local box needs to talk to an rsync on the remote box. You're using ssh to link the two.

Assuming rsync does exist on the remote box but is simply not in the default search path for your ssh login shell, then depending on your preferences and permissions on the remote box you can try one of the following (the first option is sketched just after this list):

- using the --rsync-path= option
- changing the default PATH for your remote ssh shell
- adding an alias for rsync in the remote ssh shell
- changing the executed remote command to use the absolute path to rsync

or something similar. If rsync as an executable (as opposed to a daemon) does not exist on the remote host at all, it would have to be installed.
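As a minimal sketch of the first option (the remote path /opt/bin/rsync is hypothetical; substitute wherever rsync actually lives on the remote box):

    # Point the local rsync at the remote binary explicitly:
    rsync -avz -e ssh --rsync-path=/opt/bin/rsync /local/dir/ user@remote:/remote/dir/

This also covers the last option, since --rsync-path is simply the command the local rsync asks ssh to execute on the remote side.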

There is not much point using rsync without rsync also running on the remote end (whether as a daemon or spawned over ssh), in spite of the creative sshfs solution proposed above. The whole point of rsync's efficiency is to have the two instances compare their local copies of the dataset by calculating checksums locally, comparing those checksums over the network, and only transmitting the actual data for the parts of the files that differ.

Using sshfs or any other network mount merely hides the fact that the entire dataset has to be copied across the network from the remote host in order to calculate the checksums locally, with the differences then transmitted on top of that. It would be quicker and cheaper to delete everything locally and 'scp -r' the lot.
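A sketch of the difference, with hypothetical paths and host names; --stats reports how many bytes actually crossed the wire in each case:

    # Remote rsync spawned over ssh: both sides checksum their own copy,
    # so only the changed blocks are transmitted.
    rsync -az --stats /data/ user@remote:/backup/

    # The same sync against an sshfs mount: rsync runs only locally, so
    # it must read the entire remote copy over the network just to compare.
    sshfs user@remote:/backup /mnt/backup
    rsync -az --stats /data/ /mnt/backup/
    fusermount -u /mnt/backup    # 'umount /mnt/backup' on macOS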

Not exactly: rsync first checks whether files are modified by size and timestamp, and that quick check alone is still far more efficient than scp'ing all the files. But indeed, rsync's incremental within-file update mechanism doesn't make sense over sshfs.
– eldering Jul 14 '11 at 12:52
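To illustrate that comment with a hedged sketch (same hypothetical mount as above): the size-and-timestamp quick check still skips unchanged files over sshfs, and -W (--whole-file) disables the within-file delta algorithm -- which rsync in fact already defaults to when both paths look local to it:

    # Unchanged files are still skipped via the size/mtime quick check.
    # -W copies changed files whole instead of computing block deltas,
    # which is wasted work when the "remote" side is an sshfs mount.
    rsync -a -W --stats /data/ /mnt/backup/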