rsync hangs with many remote files specified


You can copy many remote files to your local machine with the following rsync command, when the remote machine is running rsyncd:

    rsync -r hostname::'module/path1/file1 module/path2/file2 module/path3/file3 module/path4/file4' /local/path

If the number of files is large, however, rsync hangs and no files are created in the target location. In my environment the boundary is 22: the transfer finishes in a few seconds with 22 files; with 23 or more, it hangs indefinitely.

Can anyone explain why this might happen? Here are some things I've investigated:

  • Neither machine is running out of disk space or memory.
  • All files can be transferred, as long as I don't request more than 22 at a time (in other words, there is nothing special about the 23rd file; the problem is not specific to particular files).
  • There are no permissions issues; I can read all the files on the remote machine, and I can write to the target destination. (Also, as implied by the previous item, all ~50 files in my full list can be transferred, as long as I don't request more than 22 at a time.)
  • The hang occurs even when I use the -n (dry-run) option.
    I don't have an explanation for the hang, but the simplest approach is to avoid specifying individual files at all: rsync an entire directory, and use the --include and/or --exclude flags with appropriate patterns if you need finer control over what is matched.
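For instance, an exclude pattern looks like this. The directory names below are hypothetical and local, just to make the sketch self-contained; with a daemon the source would be hostname::module/ instead:

```shell
# Hypothetical local tree to demonstrate --exclude filtering.
mkdir -p /tmp/rsdemo/src/logs /tmp/rsdemo/dst
echo keep > /tmp/rsdemo/src/keep.txt
echo skip > /tmp/rsdemo/src/logs/skip.log

# Copy the whole tree, but skip anything matching *.log:
rsync -r --exclude='*.log' /tmp/rsdemo/src/ /tmp/rsdemo/dst/
```

The trailing slash on the source copies the directory's contents rather than the directory itself.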

    If your use case really requires naming each file explicitly, list those files in a separate text file and pass it to rsync with the --files-from flag.

    I used the --files-from option, and it works great. rsync is very solid, but its many configuration options can be confusing. For example, to exclude the metadata files of version-control systems such as CVS and SVN from a backup, the --cvs-exclude flag is very useful, but definitely not easily discoverable.
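A quick sketch of --cvs-exclude, again with hypothetical local paths:

```shell
# Hypothetical working tree containing version-control metadata.
mkdir -p /tmp/rsvc/src/.svn /tmp/rsvc/dst
echo code > /tmp/rsvc/src/main.c
echo meta > /tmp/rsvc/src/.svn/entries

# --cvs-exclude (-C) skips CVS-style junk, including .svn/ directories:
rsync -r --cvs-exclude /tmp/rsvc/src/ /tmp/rsvc/dst/
```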

    All of these options are described in detail in the rsync documentation.
