In addition to this, you may want to use the parallel implementations of
"gzip", "xz", "bzip2", or the new "zstd" (Zstandard), which are
"pigz"[1], "pixz"[2], "pbzip2"[3], or "zstdmt" (within the package
"app-arch/zstd")[4], in order to increase performance:

$ cd <path_to_mounted_backup_partition>
$ for tar_archive in *.tar; do pixz "${tar_archive}"; done

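If you go with zstd instead, note that unlike the tools above it keeps
its input files by default; a similar loop with "--rm" (a sketch;
"zstdmt" is equivalent to "zstd -T0", i.e. use all cores) deletes each
.tar only after it compressed successfully:

$ for tar_archive in *.tar; do zstdmt --rm "${tar_archive}"; done
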
-Ramon

[1]
* https://www.zlib.net/pigz/

[2]
* https://github.com/vasi/pixz

[3]
* https://launchpad.net/pbzip2
* http://compression.ca/pbzip2/

[4]
* https://facebook.github.io/zstd/

On 26/09/2021 13:36, Simon Thelen wrote:
> [2021-09-26 11:57] Peter Humphrey <peter@××××××××××××.uk>
>> Hello list,
> Hi,
>
>> I have an external USB-3 drive with various system backups. There are 350 .tar
>> files (not .tar.gz etc.), amounting to 2.5TB. I was sure I wouldn't need to
>> compress them, so I didn't, but now I think I'm going to have to. Is there a
>> reasonably efficient way to do this? I have 500GB spare space on /dev/sda, and
>> the machine runs constantly.
> Pick your favorite of gzip, bzip2, xz or lzip (I recommend lzip) and
> then:
> mount USB-3 /mnt; cd /mnt; lzip *
>
> The archiver you chose will compress the files and add the appropriate
> extension all on its own, and tar will use that (and the file magic) to
> find the appropriate decompressor when you want to extract files later
> (you can use `tar tf' to test if you want).
>
> --
> Simon Thelen
>

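P.S. As Simon notes, tar picks the decompressor from the file magic
when reading, so you can verify a compressed archive without any extra
flags (a sketch; "backup.tar.xz" stands in for one of your archives):

$ tar tf backup.tar.xz
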
--
GPG public key: 5983 98DA 5F4D A464 38FD CF87 155B E264 13E6 99BF