On Wed, Sep 14, 2011 at 11:50 PM, Neil Bothwick <neil@××××××××××.uk> wrote:
> On Wed, 14 Sep 2011 17:07:45 -0700, Mark Knecht wrote:
>
>> > There's no point in doing the fetch first, portage has done parallel
>> > fetching for some time - it's faster to let the distfiles download
>> > while the first package is compiling.
>> >
>> > emerge -auDN @world covers all of that - except the -j which is
>> > system-dependent.
>
>> Quite true about the parallel fetch, but I still do this anyway
>> because I want to know all the code is local before I start. With 12
>> processor cores I often build the first file before the second has
>> been downloaded. Also I don't want to start a big build, say 50-70
>> updates, and then find out an hour later when I come back that some
>> portage mirror choked on finding a specific file and the whole thing
>> died 10 minutes in. This way I have a better chance of getting to the
>> end in one pass.
>
> --keep-going will take care of that, and making sure there are no F
> flags in the --ask output before hitting Y.
> |
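Putting the pieces of the exchange above together, the full invocation under discussion would look roughly like this (a sketch only; the --jobs value is the system-dependent part Neil mentions, and 4 is picked purely for illustration):

```shell
# -a (--ask) shows the pending list -- scan it for F flags (fetch-restricted
# packages needing a manual download) before answering Yes.
# --keep-going continues past a failed package instead of aborting the run.
emerge --ask --update --deep --newuse --keep-going --jobs=4 @world
```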
>> Anyway, it works well for this old dog, and in my mind there is a good
>> reason to fetch before building, but I can see how others might not
>> want to do that.
>
> I use it too, but for a different reason. I run emerge --sync from a cron
> script, followed by emerge -f world, so all the tarballs are downloaded
> before I even start.
>
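A cron script along the lines Neil describes might look something like the following (a sketch, not his actual script; the file and log paths are my own invention):

```shell
#!/bin/sh
# Hypothetical /etc/cron.daily/emerge-prefetch: sync the portage tree,
# then fetch (but do not build) the world-update distfiles, so everything
# is already local when the interactive emerge is run later.
emerge --sync >/var/log/emerge-prefetch.log 2>&1 && \
    emerge --fetchonly world >>/var/log/emerge-prefetch.log 2>&1
```

(-f is shorthand for --fetchonly; chaining with && skips the fetch if the sync itself fails.)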
|
OK, sorry for offering my opinion. I'll just go away and not bother anyone.

Bye