How can I copy a lot of files to a host over scp?
I'm on a virtual (shared) host, and I need to push a lot of files to it. Let's say I'm moving a big web forum from another host, together with all of its users' attachments.
The problem is that long-running programs are prohibited on the hosting side (they're fighting viruses and spammers, apparently): any process is killed after about 100 seconds. For php/perl scripts that's not a problem, but an scp copy of many files will be killed too. For the same reason I can't archive the files on the current host, transfer the archive, and unpack it on the new hosting.
How can I get around this limitation?
Use rsync instead of scp. That is, instead of
scp -r ./local-dir user@remote-host:~/remote-dir
use
rsync -azP ./local-dir user@remote-host:~/remote-dir
rsync transfers the files over the same encrypted ssh channel, but when the connection is cut (e.g. the process is killed on the host side) and the same command is run again, "smart" rsync continues copying from where it left off.
What the proposed flags mean:
-a (the same as -rlptgoD) - shorthand for that whole set of flags: transfer the directory and everything nested in it recursively, preserving symlinks, permissions, file modification times, owners and groups of files and directories, and handling device files and other exotica correctly.
-z - compress the data on the fly during transfer (worthwhile when the data is mostly text; for binary data it gains little).
-P (the same as --partial --progress) - keep partially transferred files and show progress, so a restarted command can resume them (the key to the problem here).
After the transfer breaks (broken pipe), run the same command again, and rsync will continue from where it stopped.
And so you don't have to restart the command by hand every time, you can run it in a loop:
until rsync <params>; do sleep 1; done
The loop will keep re-running rsync until it exits with zero status, i.e. until the whole transfer completes successfully.
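The retry loop can also be packaged as a small script. A minimal sketch; transfer() here is a hypothetical stand-in that simulates two failed attempts, where a real script would put the rsync call with your parameters in its body:

```shell
#!/bin/sh
# Retry a transfer until it exits with status 0, pausing between attempts.
# transfer() is a simulated stand-in; in a real script its body would be e.g.:
#   rsync -azP ./local-dir user@remote-host:~/remote-dir
attempts=0
transfer() {
    attempts=$((attempts + 1))
    # Simulate two failures before the third attempt "succeeds".
    [ "$attempts" -ge 3 ]
}

until transfer; do
    sleep 1   # brief pause so a dropped link is not hammered in a tight loop
done
echo "transfer completed after $attempts attempts"
```

Run it as a plain sh script; with the simulated failures it reports completion after the third attempt.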