On Thu, 2008-12-04 at 07:10 +0000, Mick wrote:

> Almost every time I split a large file >1G into say 200k chunks, then ftp it
> to a server and then:

That's thousands of files! Have you gone mad?!

>
> cat 1 2 3 4 5 6 7 > completefile ; md5sum -c completefile

> it fails. Checking the split files in turn I often find one or two chunks
> that fail on their own md5 checks. Despite that, the concatenated file often
> works (e.g. if it is a video file it'll play alright).

Let me understand this. Are [1..7] the split files or the checksums of
the split files? If the former, then 'md5sum -c completefile' will fail
with "no properly formatted MD5 checksum lines found" or similar,
because "completefile" is not a list of checksums. If the latter, then
how are you generating [1..7]? If you are using the split(1) command to
split the file and are not passing at least "-a 3" to it, then your
reassembled file is going to be truncated, because the default suffix
length of two letters allows only 676 output files, far too few for the
thousands of chunks needed to split a 1GB+ file into 200k pieces. You
should get an error like "split: output file suffixes exhausted."
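
For the split(1) case, here is a minimal sketch of a workflow that avoids the
suffix problem (the file names are made up for illustration, and I'm using a
small test file in place of your >1G one): checksum the original once before
splitting, split with "-a 3", then verify the concatenation against that
single checksum.

```shell
# Sketch only; "bigfile" and "chunk." are hypothetical names.
set -e
dd if=/dev/urandom of=bigfile bs=1k count=1024 2>/dev/null  # stand-in for the real >1G file
md5sum bigfile > bigfile.md5       # checksum the ORIGINAL, before splitting
split -b 200k -a 3 bigfile chunk.  # -a 3 gives 26^3 = 17576 suffixes; the default -a 2 gives only 676
# ...ftp the chunk.* files and bigfile.md5 across...
cat chunk.* > bigfile              # the shell glob sorts the suffixes, so order is preserved
md5sum -c bigfile.md5              # prints "bigfile: OK" iff reassembly is byte-identical
```

Note that 'md5sum -c' is fed the checksum file, not the reassembled data; it
reads the file name from the checksum line and checks that file on disk.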

Maybe if you give the exact commands used I might understand this
better.

I have a feeling that this is not the most efficient method of file
transfer.