On Friday 19 January 2007 18:18, Jens Kubieziel wrote:
> Hi,
>
> a friend of mine wants to use Gentoo, but has a poor internet
> connection. We are thinking about a convenient way to get packages.
> We thought about redefining $FETCHCOMMAND to something like
> 'FETCHCOMMAND="echo ${URI} > package.file"'. But that (and also
> other tries) did not work. What is the best way to get a file of
> download-URLs to feed to wget?
>
> Thanks for any recommendations

FETCHCOMMAND is what portage uses to fetch stuff. Once it has run,
portage expects the fetched file to be there. What you have done is
make sure portage won't try to download it at all, so of course it
won't work.
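
For comparison, the stock definition (in /etc/make.globals; exact flags
vary between portage versions, so treat this as an approximation rather
than gospel) looks something like:

```shell
# Approximate 2007-era default from make.globals -- check your own
# /etc/make.globals before relying on the exact flags. The key point:
# whatever you put here must actually leave the file in ${DISTDIR},
# because portage checksums it there afterwards. Echoing the URI into
# some other file gives portage nothing to verify, hence the failure.
FETCHCOMMAND="/usr/bin/wget -t 5 --passive-ftp \${URI} -P \${DISTDIR}"
```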

I normally run 'emerge -pvf' to get a list of URIs to download, then
bash it into shape to get a text file listing, and send that to wget -i.

Something like:

emerge -pvf world > emerge.lst
cut -f1 -d' ' emerge.lst | sort -u > emerge.1.lst
[inspect emerge.1.lst manually and remove cruft]
wget -i emerge.1.lst

This can of course be improved tremendously. It only tries the first URI
for any given file (because of the cut), and it always attempts to
download every file for every package to be merged (as I haven't found
an easy way to get just a list of stuff not already in distfiles).
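
One cheap way around the second limitation is to check $DISTDIR yourself
before handing the list to wget. A rough sketch (the mirror URLs and the
heredoc are made-up sample data standing in for real 'emerge -pvf world'
output, and the DISTDIR default is an assumption -- adjust to taste):

```shell
#!/bin/sh
# Sketch: filter the emerge -pvf listing down to files not already in
# distfiles. On a real box, replace the heredoc with 'emerge -pvf world'.
DISTDIR="${DISTDIR:-/usr/portage/distfiles}"

# Stand-in for 'emerge -pvf world > emerge.lst': each line carries all
# the mirror URIs for one distfile, space-separated.
cat > emerge.lst <<'EOF'
http://mirror.example/distfiles/foo-1.0.tar.bz2 http://mirror2.example/distfiles/foo-1.0.tar.bz2
http://mirror.example/distfiles/bar-2.3.tar.gz
EOF

# First URI per line, deduplicated, then drop anything already fetched.
cut -f1 -d' ' emerge.lst | sort -u |
while read -r uri; do
    [ -e "$DISTDIR/$(basename "$uri")" ] || printf '%s\n' "$uri"
done > emerge.todo.lst

# then: wget -i emerge.todo.lst
```

Same caveat as before: this still only tries the first URI per file, so
a dead mirror means a manual retry.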

I'm sure you could do a better job with a bit of work; this just happens
to suit my particular needs.

alan

--
gentoo-user@g.o mailing list