On Jan 3, 2012 4:55 AM, "James Broadhead" <jamesbroadhead@×××××.com> wrote:
>
> I have a pile of files, and a personal svn repo totalling around 13GiB,
> which I want to back up cheaply to 'the cloud'. I would also like
> it to be non-trivial for someone with access to the cloud servers to
> decrypt my data.
>
> I have a 50GB free account for Box.net, but would consider others if
> they have significant advantages. The box.net account only allows
> uploading files of max 100MiB at a time.
>
> Now one problem facing me is that most cloud services don't give
> assurances of bit parity, so I'd like to be able to recover most of
> the files if I lost my local copies and there were bits missing from
> the uploaded backup. This makes the one-big-encrypted-file approach a
> no-go.
>
> My current approach is to use split-tar, with the intention of
> encrypting each file separately. (Is this worse than / equivalent to
> having one big file with ECB?)
> http://www.informatik-vollmer.de/software/split-tar.php
> ...but this seems to have difficulty sticking below the 100MiB
> individual file limit (possibly there are too many large files in the
> svn history).
>
> Any thoughts? I'm sure that many of you face this problem.
>
|
Make tarball.

Encrypt.

Split using split.

Protect with par2 using the -l option to limit size.

Upload.
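
The steps above can be sketched roughly like this. This is a minimal demo, assuming GNU tar, gpg, split and par2cmdline are installed; all directory/file names, the passphrase, and the tiny 16KiB chunk size are illustrative only (for real use you'd pick a chunk size safely under the 100MiB limit, e.g. 90m, and let gpg prompt for the passphrase interactively):

```shell
#!/bin/sh
set -e

# Demo data (random so it doesn't compress away when encrypted).
mkdir -p demo/data
dd if=/dev/urandom of=demo/data/big.bin bs=1k count=64 2>/dev/null

# 1. Make tarball.
tar -cf demo/backup.tar -C demo data

# 2. Encrypt (symmetric AES256). --batch/--passphrase are for this
#    non-interactive demo only; normally gpg prompts for the passphrase.
gpg --batch --yes --passphrase 'demo-only' --symmetric \
    --cipher-algo AES256 -o demo/backup.tar.gpg demo/backup.tar

# 3. Split into chunks below the per-file upload limit
#    (16k here only so the demo produces several parts).
split -b 16k demo/backup.tar.gpg demo/backup.tar.gpg.part-

# 4. Protect with par2: -r10 asks for ~10% recovery data, and -l
#    limits the size of the recovery files so the par2 volumes also
#    stay under the upload cap. Skipped if par2 isn't installed.
if command -v par2 >/dev/null 2>&1; then
    par2 create -r10 -l demo/backup.tar.gpg.par2 \
        demo/backup.tar.gpg.part-*
fi

# 5. Upload the part files plus the *.par2 recovery volumes.
ls demo/backup.tar.gpg.part-*
```

Recovery is the reverse: download, repair with `par2 repair` if needed, `cat` the parts back together, decrypt, untar.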
Rgds, |