On Mon, Jun 20, 2011 at 10:48 AM, Allan Gottlieb <gottlieb@×××.edu> wrote:
> On Mon, Jun 20 2011, Paul Hartman wrote:
>
>> On Mon, Jun 20, 2011 at 10:25 AM, Mark Knecht <markknecht@×××××.com> wrote:
>>> Hi,
>>> Is split an appropriate program to use to break a single 10GB file
>>> into 100 100MB files to transfer over the net using rsync, and then
>>> use cat to reassemble?
>>
>> I think it should work just fine. I've split huge files into huge
>> chunks and never had any issues.
>>
>>> Is there some better way to do this?
>>
>> I wonder if splitting is even necessary; rsync will analyze the file
>> and only transmit the differences, right? So I'd think that even if
>> the transfer fails, a retry would pick up where it left off (assuming
>> rsync keeps the failed copy).
>
> I believe that is the --partial option.
>
> allan

Yes, that looks like what I want.
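For the record, here is a minimal sketch of the split/transfer/reassemble approach. The host and destination path in the rsync line are placeholders, so that step is shown as a comment; the rest round-trips a small stand-in file locally:

```shell
# Make a small sample file to stand in for the real 10GB one.
dd if=/dev/urandom of=bigfile bs=1M count=5 2>/dev/null

# Split into 1MB pieces named bigfile.part.aa, bigfile.part.ab, ...
split -b 1M bigfile bigfile.part.

# Transfer the pieces; --partial keeps an interrupted file around so
# a retry can resume it (hypothetical destination):
#   rsync --partial bigfile.part.* remote:/some/dir/

# On the receiving side, reassemble. Shell glob order is alphabetical,
# which matches the order split created the pieces in:
cat bigfile.part.* > bigfile.rebuilt

# Verify the round trip.
cmp bigfile bigfile.rebuilt && echo "files match"
```

With 100MB chunks of a 10GB file you'd use -b 100M and get 100 pieces; the same glob-order reassembly applies.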

Is there an option to have rsync keep trying if the other end goes
down for a while, or would I need to put the rsync command into a cron
job so that it restarts every hour until the transfer completes? I
don't see one when scanning through rsync's many, many options.
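As far as I can tell rsync has no built-in retry option, so a cron job would work, but a simple wrapper loop may be easier. A sketch, with the actual rsync call left as a comment since the host and path are placeholders (the "true" stand-in just makes the sketch runnable):

```shell
# Hypothetical transfer step; substitute the real rsync invocation.
transfer() {
    # rsync --partial --timeout=60 bigfile remote:/some/dir/
    true
}

# Keep retrying until the transfer succeeds; --partial above means
# each retry resumes the interrupted file instead of starting over.
until transfer; do
    echo "transfer failed; retrying in 60s..." >&2
    sleep 60
done
echo "transfer complete"
```

Compared with an hourly cron job, the loop retries promptly after a failure and stops on its own once the transfer finishes.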

Thanks,
Mark