On 08.01.2013 18:35, Volker Armin Hemmann wrote:
> On Tuesday, 8 January 2013, 08:27:51, Florian Philipp wrote:
>> On 08.01.2013 00:20, Alan McKinnon wrote:
>>> On Mon, 07 Jan 2013 21:11:35 +0100
>>>
>>> Florian Philipp <lists@×××××××××××.net> wrote:
>>>> Hi list!
>>>>
>>>> I have a use case where I am seriously concerned about bit rot [1]
>>>> and I thought it might be a good idea to start looking for it in my
>>>> own private stuff, too.
>>
>> [...]
>>
>>>> [1] http://en.wikipedia.org/wiki/Bit_rot
[...]
>>> If you mean disk file corruption, then doing it file by file is a
>>> colossal waste of time IMNSHO. You likely have >1,000,000 files. Are
>>> you really going to md5sum each one daily? Really?
>>
>> Well, not daily but often enough that I likely still have a valid copy
>> as a backup.
>
> and who guarantees that the backup is the correct file?
>

That's why I wanted to store md5sums (or sha2sums).
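For the archives, a minimal sketch of what I have in mind, using sha256sum from coreutils (the directory and database names here are just examples, not my actual setup):

```shell
#!/bin/sh
# Minimal sketch: record checksums once, then re-verify on later runs
# to catch silent corruption. A throwaway directory stands in for the
# real data tree; all paths/names are examples.
set -e
dir=$(mktemp -d)
echo "some important data" > "$dir/file1"

# First run: build the checksum database (excluding the database itself).
( cd "$dir" && find . -type f ! -name SHA256SUMS -print0 \
    | xargs -0 sha256sum > SHA256SUMS )

# Later runs: verify; sha256sum exits non-zero if any file changed.
( cd "$dir" && sha256sum -c --quiet SHA256SUMS ) && echo "all files OK"
```

A mismatch on a later run only tells you *something* changed, of course; deciding whether it was rot or a legitimate edit still needs mtime tracking or a read-only archive area.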

> btw, the solution is zfs and weekly scrub runs.
>

Seems so. |
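If anyone digs this out of the archives later: the weekly scrub can be as simple as a cron fragment like the following (the pool name "tank" is just a placeholder for your own pool):

```
#!/bin/sh
# Example /etc/cron.weekly/zfs-scrub; "tank" is a placeholder pool name.
# Kicks off a scrub; check progress afterwards with: zpool status tank
zpool scrub tank
```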