Martin Vaeth <martin <at> mvath.de> writes:

>
> James <wireless <at> tampabay.rr.com> wrote:
> >
> > Basically from my point of view, something like TUP [1] is needed so
> > that at dependency check time you only list files that need
> > attention (linking, loading, compiling etc) thus speeding up the
> > update processes for the Package Manager (portage).
>
> This is a misunderstanding (originally already from Michael).
> The issue is not at all about speed of portage - reading one or
> the other file takes the same amount of time.
> (And having a dependency graph of files would help concerning
> speed only in the very rare situation that no file involving
> your current package dependency tree does change.)

That is one aspect of the problem, so you are partially correct.

> The whole issue is only about the policy: If the dependency
> information of the installed package and of the current
> package differs - what is the "correct" information?

DAGs, from graph theory, are often used as the basis of more
advanced tools that track dependencies between files, including
the timestamps of the last compile/touch.

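To make that concrete, here is a minimal sketch (hypothetical file names, Python 3.9+ for `graphlib`) of the idea: a DAG of file dependencies plus timestamps tells a tool exactly which targets need attention, the way TUP or make decides what to rebuild.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# target -> set of files it depends on (the edges of the DAG)
deps = {
    "app": {"main.o", "util.o"},
    "main.o": {"main.c"},
    "util.o": {"util.c"},
}

# pretend mtimes in seconds; a real tool would use os.stat(path).st_mtime
mtime = {"main.c": 100, "util.c": 50, "main.o": 90, "util.o": 60, "app": 95}

def stale(deps, mtime):
    """Return every target older than any of its (possibly stale) deps."""
    out = set()
    # walk dependencies before dependents, so staleness propagates upward
    for node in TopologicalSorter(deps).static_order():
        for d in deps.get(node, ()):
            if d in out or mtime[d] > mtime[node]:
                out.add(node)
                break
    return out

# main.c is newer than main.o, so main.o (and therefore app) needs attention
print(sorted(stale(deps, mtime)))
```

Everything not listed is untouched and can be skipped - that is the "minimize the item listing" property discussed below.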
> For each choice, it is possible to give examples where
> the policy leads to a bad situation, so both,
> static deps as well as dynamic deps can break in
> certain situations. It is only a question which breakage
> you consider more severe and/or harder to recognize/fix
> by the user.

Exactly. The current tools are insufficient, because the data
necessary to build sufficient tools - with ever-evolving logic
constraints from a myriad of system requirements - is missing;
hence what Blueness refers to as "direct linkages" is a subset of
what a full-blown DAG can do. I use TUP et al. as examples where a
DAG is used to minimize the list of items to recompile, but that is
by no means the only use of a DAG, or a similar structural
enhancement, in the tools that can be built.

> Making more revbumps will increase the chance of
> no breakage - in case of static deps only for users
> who sync and update sufficiently frequently -
> of course at the cost of redundant recompiles.

Yes. DAGs can track timestamps and many other forms of data, and
organize them for easy parsing.

> >> I guess at some point there were a bunch of devs who were messing with
> >> dependencies and not bothering to make revision bumps. This can cause
> >> users pain, so portage added a new option to ignore the cache and rescan
> >> every single relevant ebuild/eclass for sneaky dependency changes. This
> >> ensures that portage has the correct dependencies in its head while it's
> >> doing resolution, but is much slower.

True, but you are looking at it anecdotally, not through the lens
of developing tools that keep systems robustly pristine; hence the
struggles with package management.

> This is historically not correct. Dynamic deps had always been
> the only way portage treated dependencies - static deps have only
> been used as a fallback and (unfortunately) with the introduction
> of subslots.
> Once more: It is not about speed, but about: What *are* the
> "correct" dependencies? The ones referring to an historical tree
> which - especially if you did not update for a long time -
> might have almost nothing in common with the current tree
> (static deps), or the ones referring to current tree
> (dynamic deps)?

Correct. But with a DAG approach, you can develop tools that solve
these and newer, unforeseen problems. A DAG, or a collection of
DAGs, can lead to a fully characterized system where every file,
and its inter-relationships, is tracked. This approach would allow
those interested to develop system tools for ensuring a system is,
and remains, pristine. Object-oriented paradigms result in lots of
"slop" and cruft on a system.

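As a hedged sketch of that "every inter-relationship tracked" idea (the package names below are purely illustrative, not real recorded deps): once dependency edges are stored as a DAG, a tool can walk the reverse edges and answer exactly the question static vs. dynamic deps disagree about - "what needs attention if this package changes?"

```python
from collections import defaultdict

# package -> packages it depends on (forward edges of the DAG)
deps = {
    "app-editors/vim": {"sys-libs/ncurses"},
    "dev-lang/python": {"sys-libs/ncurses", "dev-libs/openssl"},
    "net-misc/curl": {"dev-libs/openssl"},
}

# invert the edges: dependency -> its consumers
rdeps = defaultdict(set)
for pkg, ds in deps.items():
    for d in ds:
        rdeps[d].add(pkg)

def consumers(pkg, rdeps):
    """All packages that transitively depend on pkg."""
    seen, stack = set(), [pkg]
    while stack:
        for nxt in rdeps[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# if openssl's dep info changes, exactly these packages need review
print(sorted(consumers("dev-libs/openssl", rdeps)))
```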
> With static deps, you will have a strange mixture of historical
> dependencies and current ones for the updates.
> With dynamic deps, the tree might not be appropriate for your
> installed packages.

Ever look closely, manually parsing up and down the directory
structure of an older Gentoo install, say a few years old? Cruft
abounds, and cruft leads to the 'dark side' of computing. *Every
file* should be fully accounted for and have its relationships
mapped and documented, including timestamps. That is one reason why
commercial, embedded product systems just run for years and years
without intervention.

> > There is no proper mechanism to accurately track all of these issue,
> > currently, or did I miss this point?
>
> There is no way to automatically decide correctly which of
> two differing informations should be used...

Correct, so you build DAGs of the data, then parse them,
statistically or manually, and discern problems. DAGs are
fundamental - probably the single most important thing you learn in
one of the most important computer-science undergraduate courses:
graph theory. Computers are all about repeatability and assurances
of accuracy, and DAGs are a fundamental tool for these sorts of
scenarios.

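The "acyclic" part is what the graph-theory course hammers home: a valid processing order exists only when there are no cycles. A minimal illustration (toy node names, Python 3.9+ stdlib):

```python
from graphlib import TopologicalSorter, CycleError

good = {"b": {"a"}, "c": {"b"}}  # a -> b -> c: a DAG
bad = {"a": {"b"}, "b": {"a"}}   # a <-> b: a cycle, so not a DAG

# a DAG always yields a repeatable dependencies-first order
print(list(TopologicalSorter(good).static_order()))  # ['a', 'b', 'c']

# a cyclic graph has no such order, and the tool can prove it
try:
    list(TopologicalSorter(bad).static_order())
except CycleError as e:
    print("cycle detected:", e.args[1])
```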
hth,
James