On Monday 20 June 2011, Paul Hartman wrote:
> On Mon, Jun 20, 2011 at 10:25 AM, Mark Knecht <markknecht@×××××.com>
> wrote:
> > Hi,
> > Is split an appropriate program to use to break a single 10GB
> > file into 100 100MB files to transfer over the net using rsync,
> > and then use cat to reassemble?
>
> I think it should work just fine. I've split huge files into huge
> chunks and never had any issues.
>
> > Is there some better way to do this?
>
> I wonder if splitting is even necessary; rsync will analyze the file
> and only transmit the differences, right? So I'd think that even if
> the transfer fails, a retry would pick up where it left off (assuming
> rsync keeps the failed copy).
>
> Also check out net-misc/unison. It seems to be designed for just this
> sort of thing.

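For the record, the split/transfer/cat workflow discussed above can be sketched like this (a small local demo; the file names are made up, and the actual rsync transfer is shown only as a comment since it needs a real remote host):

```shell
set -e
# Create a small stand-in for the 10GB file ("bigfile" is a made-up name)
head -c 1M /dev/urandom > bigfile

# Split into 256K chunks (use -b 100M for the real case); split names the
# pieces with suffixes .aa, .ab, ... which sort in the order it wrote them
split -b 256K bigfile bigfile.part.

# The transfer step would look something like:
#   rsync -av --partial bigfile.part.* user@remote:/some/dest/

# Reassemble: the shell expands the glob in lexical order, which matches
# the order split produced the pieces
cat bigfile.part.* > bigfile.rejoined

# Verify the result is byte-for-byte identical to the original
cmp bigfile bigfile.rejoined && echo "reassembly OK"
```

The trailing checksum/cmp step is worth keeping in the real workflow too, since a silently truncated chunk would otherwise go unnoticed until the file is used.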
Unison is wonderful for more complex tasks, but it is very inefficient with
large files. In fact it uses the rsync algorithm internally to get good
performance, yet it still isn't the best choice in this scenario; see:
http://www.cis.upenn.edu/~bcpierce/unison/download/releases/stable/unison-manual.html#speeding

I keep the user home directories in sync across two computers, and I use
both of them daily. That would be impossible without unison, but the 20GB
virtual machine image is excluded from the sync.
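Excluding a file like that is a one-line ignore preference in the Unison profile (the roots and path below are made up for illustration):

```
# ~/.unison/default.prf (hypothetical profile)
root = /home/user
root = ssh://otherhost//home/user

# Skip the 20GB VM image; Path is relative to the roots
ignore = Path vms/disk.img
# or match by file name anywhere in the tree:
ignore = Name *.vdi
```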

Cheers
Francesco

--
Linux Version 2.6.39-gentoo-r1, Compiled #1 SMP PREEMPT Thu Jun 9
11:20:57 CEST 2011
Two 1GHz AMD Athlon 64 X2 Processors, 4GB RAM, 4021.84 Bogomips Total
aemaeth