jos houtman wrote:

> I've got to back up a huge, changing/growing collection of files (~15
> million / ~6TB) and the existing rsync solution just doesn't cut it any
> more. Below I will explain the situation, what the problem is, and my
> current train of thought. Where you people can help is by providing me
> with a sounding board, and hopefully some out-of-the-box thinking :D.

Surely not every file is linked directly in a web page?

Why not keep links in a database that point to the correct location on
disk for the images themselves? Then all you need to do is query the
database for a timestamp field that has changed, and you know exactly
what you need to back up; and at any time you can move the underlying
files around and just update the links in the database...

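A minimal sketch of that timestamp query, assuming a hypothetical
`files` table with `path` and `updated_at` columns maintained by the
application (table and column names are illustrative, not from the
original post; SQLite is used here only for the example):

```python
import sqlite3
import time

# Hypothetical catalog of files; the application would keep updated_at
# current whenever it writes or replaces a file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (path TEXT PRIMARY KEY, updated_at REAL)")

now = time.time()
conn.executemany(
    "INSERT INTO files VALUES (?, ?)",
    [
        ("/data/img/a.jpg", now - 7200),  # untouched since the last backup
        ("/data/img/b.jpg", now - 60),    # modified recently
    ],
)

last_backup = now - 3600  # when the previous backup run started

# Only rows whose timestamp changed since then need backing up --
# no full filesystem scan over 15 million files.
changed = [
    row[0]
    for row in conn.execute(
        "SELECT path FROM files WHERE updated_at > ?", (last_backup,)
    )
]
print(changed)  # ['/data/img/b.jpg']
```

The resulting path list could then be fed straight to the backup tool,
e.g. via rsync's --files-from.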
--
gentoo-server@g.o mailing list