Ulrich Mueller <ulm@g.o> wrote:

> I think the conclusion is that github generates tarballs on the fly,
> and therefore we cannot rely on them being invariant over a long time.

Yes, but with emphasis on _long_ time and on theory.
In practice this has happened exactly _once_ in a decade
(according to all we have learnt so far), for the understandable
reason of fixing an annoying incompatibility in an exotic case.
And the existence of zopfli shows that other backward-compatible
improvements _would_ have been possible, but apparently keeping
the produced tarballs unchanged was always rated higher than
anything else (up to this exception).

So I would not worry too much about it: it is not worth the cost of
hosting a huge number of tarballs permanently (or of convincing
upstream to let them be hosted by github for every single version,
only because one cannot theoretically exclude that a similar thing
will ever happen again). Yes, for the transition period (until all
github servers use a new enough version) a solution for the few
affected tarballs has to be found (like temporarily hosting them on
devspace). But after this period it is only a question of updating
the checksums once for the affected packages.
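
For illustration, here is a minimal sketch (not tied to github's actual
compression pipeline) of why a regenerated tarball breaks an archive-level
checksum even when the packed content is bit-identical: compressing the same
tar bytes with different gzip settings yields byte-different archives.

```python
import gzip
import hashlib
import io
import tarfile

# Build a deterministic tar archive in memory (fixed name, size, mtime).
def make_tar_bytes() -> bytes:
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tf:
        data = b"hello world\n"
        info = tarfile.TarInfo(name="pkg-1.0/README")  # hypothetical file name
        info.size = len(data)
        info.mtime = 0
        tf.addfile(info, io.BytesIO(data))
    return buf.getvalue()

tar = make_tar_bytes()

# Compress the *same* tar bytes with two different gzip settings,
# standing in for two different compressor versions on the server.
fast = gzip.compress(tar, compresslevel=1, mtime=0)
best = gzip.compress(tar, compresslevel=9, mtime=0)

# The archives differ byte-for-byte, so any checksum over them differs ...
print(hashlib.sha256(fast).hexdigest() != hashlib.sha256(best).hexdigest())
# ... although the unpacked content is identical.
print(gzip.decompress(fast) == gzip.decompress(best))
```

This is why updating the stored checksum is all that is needed after such a
change: the upstream sources themselves are unchanged.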