On Wed, 14 Sep 2011 17:07:45 -0700, Mark Knecht wrote:

> > There's no point in doing the fetch first, portage has done parallel
> > fetching for some time - it's faster to let the distfiles download
> > while the first package is compiling.
> >
> > emerge -auDN @world covers all of that - except the -j which is
> > system-dependent.

> Quite true about the parallel fetch, but I still do this anyway
> because I want to know all the code is local before I start. With 12
> processor cores I often build the first file before the second has
> been downloaded. Also I don't want to start a big build, say 50-70
> updates, and then find out an hour later when I come back that some
> portage mirror choked on finding a specific file and the whole thing
> died 10 minutes in. This way I have a better chance of getting to the
> end in one pass.

--keep-going will take care of that, along with making sure there are no F
flags in the --ask output before hitting Y.
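Something like the following would combine the flags discussed above into one
invocation (a sketch only; the exact flag set is per the thread, and any -j
value would be system-dependent):

```shell
# Update @world, asking for confirmation first; --keep-going makes emerge
# skip a package that fails (e.g. a mirror problem) and carry on with the
# rest instead of aborting the whole run. Parallel fetching of distfiles
# happens automatically while the first package compiles.
emerge -auDN --keep-going @world
```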

> Anyway, it works well for this old dog, and in my mind there is a good
> reason to fetch before building but I can see how others might not
> want to do that.

I use it too, but for a different reason. I run emerge --sync from a cron
script, followed by emerge -f world, so all the tarballs are downloaded
before I even start.
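A minimal crontab sketch of that arrangement (the nightly schedule and the
quiet flags are assumptions, not something stated above):

```shell
# /etc/crontab fragment: sync the portage tree overnight, then pre-fetch
# (-f, i.e. --fetchonly) every distfile the world update would need, so
# the interactive build later never stalls waiting on a mirror.
0 3 * * *  root  emerge --sync --quiet && emerge -qf world
```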


--
Neil Bothwick

Master of all I survey (at the moment, empty pizza boxes)