On 04.10.2012 12:37, Neil Bothwick wrote:
> On Thu, 04 Oct 2012 10:25:53 +0200, Michael Hampicke wrote:
>
>> I am using http://backuppc.sourceforge.net/ (in portage) which brings
>> some nice features like compression, de-duplication and a web interface.
>> Once configured it runs automatically.
>
> +1 for BackupPC
>
>> What do you use for cloud/offsite backups? I am still searching for the
>> perfect solution. Requirements: backups must be encrypted, delta-sync,
>> and the ability to resume interrupted transfers to the offsite location.
>
> I have a Python script that uses a combination of dar (to create
> encrypted backups locally) and boto (to upload them to S3). I used
> duplicity some years ago and found it consumed enormous amounts of
> bandwidth, more than my ISP provided at the time.
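A minimal sketch of that kind of workflow, with hypothetical paths, archive and bucket names throughout, and with s3cmd standing in for the upload step (the script described above uses the boto Python library instead):

```shell
# Hypothetical names/paths; adjust to your setup.
# Create a compressed, encrypted dar archive of /home:
# -c names the archive, -R sets the backup root,
# -z enables compression, -K aes: makes dar prompt for a passphrase.
dar -c /var/backups/home_full -R /home -z -K aes:

# Upload the resulting slice file(s) to S3 (s3cmd shown here
# instead of the poster's boto-based script):
s3cmd put /var/backups/home_full.*.dar s3://my-offsite-backups/
```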
|
Hm, dar looks interesting. I'll have a look at it. The man page states
that it is possible to restore individual files from a dar archive
without reading the complete archive (in contrast to tar). Is this also
true when using compression and/or encryption? That would be a great
feature for fast single-file restores (just mount the offsite location
with sshfs or similar and tell dar to restore file25 from my 500GB
backup without having to transfer the whole damn thing :) )
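That restore path could look roughly like this (hypothetical host, mount point and paths; dar keeps a catalogue at the end of the archive and seeks to just the data it needs, and its documentation indicates this selective access also works with compressed and encrypted archives, since files are compressed individually and encryption is done in blocks):

```shell
# Hypothetical host and paths throughout.
# Mount the offsite backup directory locally:
sshfs backupuser@offsite.example.com:/backups /mnt/offsite

# Extract a single file into /tmp/restore; dar consults its catalogue
# and reads only the slices/offsets it needs rather than streaming the
# whole archive (-x names the archive basename, -g selects the path,
# -R sets the restore root):
dar -x /mnt/offsite/full_backup -R /tmp/restore -g path/to/file25

fusermount -u /mnt/offsite
```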