On Sun, Sep 26, 2021 at 8:57 PM Peter Humphrey <peter@××××××××××××.uk>
wrote:

> Hello list,
>
> I have an external USB-3 drive with various system backups. There are 350
> .tar files (not .tar.gz etc.), amounting to 2.5TB. I was sure I wouldn't
> need to compress them, so I didn't, but now I think I'm going to have to.
> Is there a reasonably efficient way to do this?

find <mountpoint> -name \*tar -exec zstd -TN {} \;

where N is the number of cores you want to allocate. Use zstd -T0 (or just
zstdmt) if you want to use all the available cores. I use zstd for
everything now, as it's as good as or better than all the others in the
general case.

Parallel means it uses more than one core, so on a modern machine it is
much faster.
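As a concrete sketch of the recipe above (assuming zstd is installed): the
script below builds a tiny throwaway directory in place of the real USB
mount, then runs essentially the same find/zstd one-liner over it. The
--rm flag is an assumption on my part — it deletes each .tar only after it
compresses successfully, which you'd want for a 2.5TB drive; drop it to
keep the originals until you've verified the .zst files.

```shell
#!/bin/sh
# Stand-in for the real backup mount point; a temp dir with one small .tar.
backup=$(mktemp -d)
echo 'demo payload' > "$backup/file.txt"
tar -cf "$backup/backup1.tar" -C "$backup" file.txt

# The actual work: -T0 = use all available cores per file,
# --rm = remove each .tar after a successful compress, -q = quiet.
find "$backup" -name '*.tar' -exec zstd -q -T0 --rm {} \;

ls "$backup"   # backup1.tar is gone; backup1.tar.zst remains
```

Note that -T only parallelises within one file; if you'd rather compress
several of the 350 archives at once, you can split the cores across jobs
instead, e.g. with xargs -P.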