On Monday, 27 September 2021 14:30:36 BST Peter Humphrey wrote:

> On Monday, 27 September 2021 02:39:19 BST Adam Carter wrote:
> > On Sun, Sep 26, 2021 at 8:57 PM Peter Humphrey <peter@××××××××××××.uk>
> > wrote:
> > > Hello list,
> > >
> > > I have an external USB-3 drive with various system backups. There are
> > > 350 .tar files (not .tar.gz etc.), amounting to 2.5TB. I was sure I
> > > wouldn't need to compress them, so I didn't, but now I think I'm going
> > > to have to. Is there a reasonably efficient way to do this?
> >
> > find <mountpoint> -name \*tar -exec zstd -TN {} \;
> >
> > Where N is the number of cores you want to allocate. Use zstd -T0 (or
> > just zstdmt) if you want to use all the available cores. I use zstd for
> > everything now, as it's as good as or better than all the others in the
> > general case.
> >
> > Parallel means it uses more than one core, so on a modern machine it is
> > much faster.
>
> Thanks to all who've helped. I can't avoid feeling, though, that the main
> bottleneck has been missed: that I have to read and write on a USB-3 drive.
> It's just taken 23 minutes to copy the current system backup from USB-3 to
> SATA SSD: 108GB in 8 .tar files.
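For anyone wanting to try the same thing, Adam's one-liner can be wrapped into a small script. This is a sketch only: the scratch directory and sample tar file are stand-ins for the real USB-3 mount point, and it falls back to gzip on systems without zstd installed.

```shell
#!/bin/sh
set -e

# Stand-in for the real mount point (illustrative scratch setup, not
# the actual backup drive from the thread).
BACKUPS=$(mktemp -d)
echo "dummy payload" > "$BACKUPS/etc.txt"
tar cf "$BACKUPS/sys-backup.tar" -C "$BACKUPS" etc.txt

# Compress every .tar below the mount point. find hands zstd one file
# at a time; -T0 then multithreads within each file. Both zstd (by
# default) and gzip -k keep the original .tar alongside the compressed
# copy, so nothing is deleted until you choose to remove it.
if command -v zstd >/dev/null 2>&1; then
    find "$BACKUPS" -name '*.tar' -exec zstd -q -T0 {} \;
else
    find "$BACKUPS" -name '*.tar' -exec gzip -k {} \;
fi

# List what was produced (sys-backup.tar.zst or sys-backup.tar.gz).
find "$BACKUPS" -name 'sys-backup.tar.*'
```

Because `-exec ... \;` runs one compressor per file in sequence, the parallelism here is within each file, not across files.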
I was premature. In contrast to the 23 minutes to copy the files from USB-3 to
internal SSD, zstd -T0 took 3:22 to compress them onto another internal SSD. I
watched /bin/top and didn't see more than 250% CPU (this is a 24-CPU box) with
next-to-nothing else running. The result was 65G of .tar.zst files.

So, at negligible cost in CPU load*, I can achieve a 40% saving in space. Of
course, I'll have to manage the process myself, and I still have to copy the
compressed files back to USB-3 - but then I am retired, so what else do I have
to do? :)
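As a quick sanity check on the figures above (108 GB of .tar down to 65 GB of .tar.zst):

```shell
# Percentage of space saved: 1 - 65/108, rounded to whole percent.
saving=$(awk 'BEGIN { printf "%.0f", (1 - 65/108) * 100 }')
echo "space saved: ${saving}%"    # prints "space saved: 40%"
```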
Thanks again, all who've helped.

* ...so I can continue running my 5 BOINC projects at the same time.

--
Regards,
Peter.